PAVAN KUMAR TUMMALAPALLI

Python Django Flask Developer

Hyderabad, India

Languages:

English, Hindi, Telugu

Experience

Accion Labs India Private Limited September 2017 - March 2019 (1 year 7 months)

Senior Software Engineer, Bengaluru Area, India

Designed an ETL pipeline composed of microservices, where each microservice loads files from various data sources, transforms them, and finally sends them on to other data sources. The data sources include Apache Kafka, Azure Service Bus, Azure Storage, and API sources.

  • Created a shared framework for multiple consumer and producer clients, exposing abstract consume and send methods that each data source client implements internally.
  • Created a real-time inbound client – worker (business logic) – outbound client framework, so that each microservice only needs to add its business logic and supply inbound and outbound client settings.
  • Created a custom Kafka logging handler that sends every log record to Kafka for storage and further use.
  • Created log processors that consume the logs from Kafka and ingest them into Elasticsearch, enabling log analysis in Kibana.
  • Added timers and metrics for our services by storing them in InfluxDB.
  • Created Grafana dashboards backed by an InfluxDB data source to measure our services' performance.
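The inbound client – worker – outbound client pattern described above can be sketched roughly as follows; the class names and in-memory clients here are illustrative stand-ins, not the actual framework API:

```python
class Pipeline:
    """Minimal sketch of the inbound-client -> worker -> outbound-client
    framework: the framework owns the consume/send loop, and each
    microservice supplies only its business logic plus client settings.
    """

    def __init__(self, inbound, worker, outbound):
        self.inbound = inbound    # any object with consume() -> iterable of messages
        self.worker = worker      # business logic: message -> transformed message
        self.outbound = outbound  # any object with send(message)

    def run(self):
        for message in self.inbound.consume():
            self.outbound.send(self.worker(message))


# In-memory clients standing in for Kafka / Service Bus / storage clients:
class ListInbound:
    def __init__(self, items):
        self.items = items

    def consume(self):
        return iter(self.items)


class ListOutbound:
    def __init__(self):
        self.sent = []

    def send(self, message):
        self.sent.append(message)


out = ListOutbound()
Pipeline(ListInbound(["a", "b"]), str.upper, out).run()  # out.sent == ["A", "B"]
```

Swapping `ListInbound`/`ListOutbound` for Kafka or Azure Service Bus clients is then purely a configuration change, which is the point of the shared framework.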

Software used: Apache Kafka, Azure Service Bus, Azure Storage, Amazon S3, Elasticsearch, InfluxDB, Kibana, Grafana.
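A custom Kafka logging handler of the kind described can be sketched as below; the producer is injected so the demo runs without a broker, and in production it would be something like kafka-python's `KafkaProducer` (an assumption — the client library is not named above):

```python
import json
import logging


class KafkaLoggingHandler(logging.Handler):
    """Logging handler that ships every log record to a Kafka topic."""

    def __init__(self, producer, topic):
        super().__init__()
        self.producer = producer  # needs a send(topic, value) method
        self.topic = topic

    def emit(self, record):
        try:
            payload = json.dumps({
                "level": record.levelname,
                "logger": record.name,
                "message": self.format(record),
            }).encode("utf-8")
            self.producer.send(self.topic, payload)
        except Exception:
            self.handleError(record)


class _StubProducer:
    """In-memory stand-in for a real KafkaProducer, used here for the demo."""

    def __init__(self):
        self.sent = []

    def send(self, topic, value):
        self.sent.append((topic, value))


producer = _StubProducer()
logger = logging.getLogger("etl")
logger.setLevel(logging.INFO)
logger.addHandler(KafkaLoggingHandler(producer, "service-logs"))
logger.info("file transformed")  # record lands in producer.sent
```

Downstream, the log processors would consume this topic and bulk-index the JSON payloads into Elasticsearch for analysis in Kibana.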


Flipkart Internet Private Limited Feb 2015 - July 2017 (2 years 6 months)

Product Solution Engineer II, Bengaluru Area, India

Developed Flipkart's in-house Advertising WSR self-serve console, which internal stakeholders used to create advertising campaigns for WSR.

Technologies used: Flask, JavaScript.

Designed and developed an advertising analytics big-data pipeline used to analyse the distribution and classification probabilities of search queries.

Technologies used: Oozie Workflow, Hive, shell script

Advertising Metrics – Built data models to visualize advertising metrics in Kibana using Hadoop and Elasticsearch. Technologies used: Java, MapReduce, Hive.

Biz Metrics Alert Detection – Developed a separate self-service alert system for merchandising and monitoring, where alerts are generated when metrics cross configured thresholds.

Technologies used: Java, Quartz Scheduler, Druid API.
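The threshold check at the heart of such an alert system can be sketched as follows; the rule fields and names are hypothetical, and in the real system the metric values would come from the Druid API on a Quartz schedule:

```python
from dataclasses import dataclass


@dataclass
class AlertRule:
    metric: str
    threshold: float
    direction: str = "above"  # alert when value is above/below threshold


def evaluate(rules, metrics):
    """Return the rules whose thresholds are breached by the given metrics."""
    breached = []
    for rule in rules:
        value = metrics.get(rule.metric)
        if value is None:
            continue  # metric not reported this interval; skip the rule
        if rule.direction == "above" and value > rule.threshold:
            breached.append(rule)
        elif rule.direction == "below" and value < rule.threshold:
            breached.append(rule)
    return breached


rules = [
    AlertRule("ad_clicks", 1000, "below"),
    AlertRule("error_rate", 0.05, "above"),
]
metrics = {"ad_clicks": 650, "error_rate": 0.01}
alerts = evaluate(rules, metrics)  # only the ad_clicks rule fires
```

Keeping rules as data rather than code is what makes the system self-service: stakeholders add thresholds without a deploy.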


Genome Life Sciences July 2014 - January 2015 (7 months)

Software Developer, Chennai Area, India

DNA analysis has various stages, some independent and some dependent on others. Many are long-running processes that may take days to complete.

I was involved in modifying and optimising existing services to run in cluster environments, both on-premises and on Amazon clusters.

Wrote Python and Perl scripts for job scheduling with both the Slurm and Sun Grid Engine schedulers.
Configured Amazon EC2 instances, created an Amazon cluster with StarCluster, set up a MySQL RDS instance, and implemented S3 multipart upload with Python.
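Scheduling glue of this kind often means generating submission scripts for long-running, dependent stages. A minimal sketch, assuming Slurm's sbatch and its `--dependency=afterok` syntax (the helper name and defaults are illustrative):

```python
def sbatch_script(job_name, command, cpus=4, time_limit="2-00:00:00", deps=None):
    """Render a minimal sbatch submission script for one pipeline stage.

    deps is a list of Slurm job ids this stage must wait for; afterok
    makes the stage run only if those jobs finished successfully.
    """
    lines = [
        "#!/bin/bash",
        f"#SBATCH --job-name={job_name}",
        f"#SBATCH --cpus-per-task={cpus}",
        f"#SBATCH --time={time_limit}",
    ]
    if deps:
        lines.append("#SBATCH --dependency=afterok:" + ":".join(str(d) for d in deps))
    lines.append(command)
    return "\n".join(lines) + "\n"


# A dependent stage that waits for hypothetical job 1234 to succeed:
script = sbatch_script("align-reads", "bwa mem ref.fa reads.fq > out.sam", deps=[1234])
```

Chaining `afterok` dependencies like this lets the independent analysis stages run in parallel while the dependent ones wait.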

Configured Amazon EC2 load balancing in a cluster environment.

Wrote shell scripts for tool installation on Amazon and on-premises servers.

Technologies used: Python boto; Perl Net::Amazon::EC2 modules for working with Amazon AWS.


HCL Technologies June 2013 - July 2014 (1 year 2 months)

Software Engineer, Hyderabad Area, India

Job Workflow is a grid-computing cluster system made up of a heterogeneous collection of nodes serving a diverse workload, load-balancing and sharing system resources effectively to achieve a target quality of service. Jobs may be interactive or batch jobs; all are assigned by a scheduler that allocates resources in an optimised way. Every job can be monitored with details such as user, submission time, job ID, running time, and resources allocated. The system also provides parallel computation, in which many calculations are carried out simultaneously: large problems are divided into smaller ones that are solved concurrently ("in parallel") on different nodes, with the solutions finally merged.

Roles and responsibilities:

  • Designed the scheduler system by calculating load thresholds per CPU and across clusters.
  • Designed scheduling algorithms such as round robin, fair share, and fixed-priority pre-emptive.
  • Designed logging and storage of job details in an SQLite database.
  • Prepared test cases for running different types of jobs with the system in a cluster environment.

Technologies used: mpi4py, subprocess, multiprocessing, PAM-specific modules for parallel computing.
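Of the policies listed above, round robin is the simplest to illustrate: each job gets at most one time quantum per turn, then goes to the back of the queue. A minimal sketch (job tuples and the quantum value are illustrative):

```python
from collections import deque


def round_robin(jobs, quantum):
    """Round-robin scheduling: each job runs for at most `quantum`
    time units per turn; returns the order in which jobs finish.

    jobs: list of (job_id, burst_time) pairs.
    """
    queue = deque(jobs)
    finished = []
    while queue:
        job_id, remaining = queue.popleft()
        if remaining <= quantum:
            finished.append(job_id)          # job completes within this turn
        else:
            queue.append((job_id, remaining - quantum))  # back of the queue
    return finished


order = round_robin([("a", 5), ("b", 2), ("c", 8)], quantum=3)  # ["b", "a", "c"]
```

Short jobs like "b" finish early while long jobs keep cycling, which is the fairness property that distinguishes round robin from fixed-priority pre-emptive scheduling.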


REBUS SOFT-SYS PRIVATE LIMITED January 2011 - November 2011 (11 months)

Software Developer, Hyderabad Area, India

IAMCS (Integrated Appliances Management and Control System) is a system product providing centralised control of lighting, appliances, security locks on gates and doors, and other systems, for improved convenience, comfort, energy efficiency, and security. Devices are connected through an X10 interface controller and are automated both periodically and at specific times.

Responsibilities

  • Wrote Python and Perl scripts for test automation of devices.
  • Used modules such as Linux serial port, pyusb, and x10-cm17 for interacting with the X10 controller.
  • Generated data validation/reconciliation reports.
  • Took part in peer-to-peer code reviews.
  • Validated code against client requirements.
Technologies used: Python, Linux Centos, Ubuntu, SSH.

Skills

Backend Development, Big Data, Celery, Django, Django REST Framework, Docker, Elasticsearch, Fabric, Flask, Git, JavaScript, Jenkins, Linux, MongoDB, MySQL, Nginx, NumPy, Pandas, PostgreSQL, RabbitMQ, ReactJS, Redis, Scrapy, Spark, Virtualenv, jQuery

Joined: April 2019