Jay Fifadra

SOFTWARE ARCHITECT

Mumbai, India

Summary

Performs in-depth analysis of industry business processes and provides technology consultation.
Actively seeking opportunities in web application and website development.
Expert in Django, the Python web development framework.
Has used most of the major Python libraries over the years.
Has worked at a company with an analytics background.
Has experience developing applications from scratch, handling the full-stack role for the company.
Has around 5 years of experience developing software products.
Works as an entrepreneur through a self-founded company that takes on projects on a contract basis.
Carried out pivots as required during the product development cycle, used advanced technologies such as Apache Spark, and worked closely with the business team.
Strong, demonstrable logical, analytical, and problem-solving skills.

Languages:

English, Gujarati, Hindi, Marathi (Marāṭhī)

Favorite Python Packages:

pandas, NumPy, django-excel, BeautifulSoup, SciPy, Matplotlib, requests (for APIs), Scrapy, oauth, WeasyPrint

Experience

I have over five years of experience working with Python and Django.

Highlights of some of my key projects:

ETL PROCESS AUTOMATION FOR A PRODUCT IN THE HR DOMAIN
1. Our client in turn has a USA-based client with a product in the HR domain. That client has a daily requirement to transform data into a specific format (preferably XML), as the client's ingestion engine is designed to accept XML data.
2. However, the input data can arrive in formats such as CSV, XLSX, JSON, or even XML.
3. We write Python scripts daily to carry out these transformations as per the client's requests.
4. The daily challenge is to process large data sets (a few million records), map data fields every day, and handle multiple sources of metadata, multiple data formats across multiple files, and data interrelationships across multiple files.
5. Owing to this, we started a project to automate the daily task of transforming the data received into the target format.
6. We built a tool written in Django (a web framework written in Python) to automate this task to some extent and save a lot of time daily.
7. The tool is configurable per client project and covers different formats (different types of data files).
8. We receive the target file format, including each field definition, as an XSD; this describes the data that will be ingested into the client's data engine (the product).
9. Based on this XSD file we configure the tool, map the various tags to the input fields, and in the last step generate a Python script that carries out the actual transformation.
10. The core challenge of the tool is to produce a relevant Python script that can handle any number of data formats.
11. Besides handling multiple data formats, the generated script must also include proper validations, as it is used for data ingestion.
12. The generated Python script also has to produce the XML output from the input; the XML tags have parent-child relationships among themselves, and care has to be taken that a child element is created and appended only when its parent is present, potentially across a multi-level tree (illustrated in the sketch below).
13. On top of the above, there is ongoing work to optimize the generated Python script so that data loading is faster and uses less CPU.
Technologies used: Django, Python, various Python libraries, JavaScript, HTML, CSS.
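Below is a minimal Python sketch of the kind of transformation script the tool generates: it builds a child XML element only when its parent exists, so the multi-level tree stays consistent. The column names, the mapping dictionary, and the file names are illustrative assumptions rather than an actual client configuration.

    import pandas as pd
    import xml.etree.ElementTree as ET

    # Hypothetical mapping of input columns to target XML tags (parent -> children).
    MAPPING = {
        "Employee": {
            "columns": {"emp_id": "EmployeeId", "name": "Name"},
            "children": {
                "Address": {"columns": {"city": "City", "zip": "PostalCode"}},
            },
        },
    }

    def build_element(tag, spec, row):
        """Create an element only if at least one mapped column has a value."""
        values = {xml_tag: row.get(col) for col, xml_tag in spec.get("columns", {}).items()}
        if not any(pd.notna(v) for v in values.values()):
            return None  # parent absent: skip this subtree entirely
        elem = ET.Element(tag)
        for xml_tag, value in values.items():
            if pd.notna(value):
                ET.SubElement(elem, xml_tag).text = str(value)
        # Children are created and appended only when the parent element exists.
        for child_tag, child_spec in spec.get("children", {}).items():
            child = build_element(child_tag, child_spec, row)
            if child is not None:
                elem.append(child)
        return elem

    def transform(csv_path, out_path):
        df = pd.read_csv(csv_path)
        root = ET.Element("Employees")
        for _, row in df.iterrows():
            record = build_element("Employee", MAPPING["Employee"], row)
            if record is not None:
                root.append(record)
        ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)

The real generated scripts additionally carry the field validations and performance optimizations described above.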

PROJECT FOR A STARTUP COMPANY (SERVING HEDGE FUNDS IN THE FINANCE DOMAIN)
• Phase 1
1. A USA-based application focused on providing a solution for searching various funds for investment based on user queries.
2. Facilitated a trading system based on the user's existing accounts with different brokers.
3. Worked on large data sets with as many as 160 columns; handled data loading, cleaning, etc.
4. Built various APIs for the user workflow.
5. Wrote scripts for extracting data from various sources comprising as many as 14k files.
6. Worked on OrientDB, a graph database.
7. Built Python API wrappers over existing PHP wrappers.
8. Wrote a shell script to download data from an SFTP server, with cron jobs to run it periodically.
• Phase 2
1. Worked on a data source from the client's vendor, which was in XML format.
2. Extracted various data points from various XML files (a few thousand) at the client's request.
3. Loaded the data into a Postgres database after applying various validations and business logic.
4. Wrote various Python scripts to research the data for certain parameters.
5. Established various interrelationships within the data provided by the data vendors.
6. Wrote scripts for monitoring the correctness of data coming from the data vendors (Data Monitoring System).
7. The Data Monitoring System alerts the user whenever new data arrives from the data vendor and there is a large change or a predefined threshold is breached (see the sketch after this project).
8. Besides this, there is derived data obtained from the raw data, e.g. calculating returns for various entities.
9. Arranged a cyclic dump of data from the database, with automatic removal within 15 days, using a mix of Python and shell scripts.
• Technologies used: Django, Python, PHP, web services, cron.
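A minimal sketch of the threshold check at the heart of the Data Monitoring System; the 20% threshold, the column names, and the print-based alert are assumptions standing in for the real configuration and notification path.

    import pandas as pd

    THRESHOLD = 0.20  # assumed: flag anything that moved more than 20% since the last load

    def find_breaches(previous, latest, key, value_col):
        """Compare the latest vendor delivery against the previous load and return breaching rows."""
        merged = latest.merge(previous, on=key, suffixes=("_new", "_old"))
        merged["pct_change"] = (
            (merged[f"{value_col}_new"] - merged[f"{value_col}_old"]).abs()
            / merged[f"{value_col}_old"].abs()
        )
        return merged[merged["pct_change"] > THRESHOLD]

    if __name__ == "__main__":
        prev = pd.DataFrame({"fund_id": ["A", "B"], "nav": [100.0, 50.0]})
        new = pd.DataFrame({"fund_id": ["A", "B"], "nav": [101.0, 80.0]})
        for _, row in find_breaches(prev, new, key="fund_id", value_col="nav").iterrows():
            # The real system raised an alert to the user; here we just print.
            print(f"ALERT: {row['fund_id']} moved {row['pct_change']:.1%} since the last delivery")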

PRODUCT IN THE RETAIL SECTOR (DATA-DRIVEN MARKETING PRODUCT)
1. Played a critical role in the development of various modules of the product, called Customer Sense, for retail clients.
2. Carried out a couple of pivots to advanced technologies such as Apache Spark as required by the product.
3. Produced various types of graphs in D3 and C3 based on the data.
4. Followed agile methodology for development.
5. The modules include various analyses of customer data, based on which a customer segmentation report is prepared for the client and, further, marketing campaign decisions are made (see the sketch after this project).
6. Other modules include One-Time Buyer, Customer Churn, and other explorations of the client's customer data.
7. Managed various vendor data for integration within the product.
Technologies include: Django, Python, PostgreSQL, Apache PySpark, HTML, CSS, JavaScript, jQuery, D3.js, C3.js.
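A minimal PySpark sketch of the kind of segmentation job that could sit behind such a customer segmentation report; the transaction schema, file paths, and segment cut-offs are illustrative assumptions.

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("customer_segmentation").getOrCreate()

    # Hypothetical transactions table: customer_id, order_date, amount.
    tx = spark.read.csv("transactions.csv", header=True, inferSchema=True)

    rfm = tx.groupBy("customer_id").agg(
        F.max("order_date").alias("last_order"),
        F.count("*").alias("frequency"),
        F.sum("amount").alias("monetary"),
    )

    # Simple illustrative segments: one-time buyers vs. repeat vs. high-value customers.
    segments = rfm.withColumn(
        "segment",
        F.when(F.col("frequency") == 1, "one_time_buyer")
         .when(F.col("monetary") > 1000, "high_value")
         .otherwise("repeat"),
    )

    segments.write.mode("overwrite").parquet("customer_segments.parquet")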

PYTHON BOT IN CRYPTOCURRENCY
1. A custom strategy defined by the client for trading on a crypto exchange using an auto-trading bot.
2. The bot sends automatic buy and sell orders to the respective exchanges through web services (see the sketch below).
3. Web-scraped a few Twitter bots that publish the best buy/sell calls and calculated the returns from them.
4. Further details are covered by a non-disclosure agreement.
Technologies used: Python, Django, Celery, Scrapy.
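A heavily simplified sketch of turning a strategy signal into an order request; the exchange URL, endpoint, and payload fields are hypothetical placeholders, not any specific exchange's API, and the actual strategy logic remains under NDA.

    import requests

    EXCHANGE_URL = "https://api.example-exchange.com/v1/orders"  # hypothetical endpoint
    API_KEY = "..."  # loaded from configuration in the real bot

    def place_order(symbol, side, quantity):
        """Send a market order for the given side ('buy' or 'sell') to the exchange."""
        payload = {"symbol": symbol, "side": side, "type": "market", "quantity": quantity}
        response = requests.post(
            EXCHANGE_URL,
            json=payload,
            headers={"Authorization": f"Bearer {API_KEY}"},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()

    def act_on_signal(signal):
        # The strategy layer (run periodically as a Celery task) produces 'buy'/'sell' signals.
        if signal in ("buy", "sell"):
            place_order("BTCUSDT", signal, quantity=0.01)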

REAL ESTATE CRM
1. Features such as automatic payment reminders and email/SMS notifications to the respective parties (home buyers).
2. Payment reminders are calculated based on the installments generated and the payments made by the buyer (see the sketch below).
3. Interest is calculated automatically based on custom logic.
4. A key aspect is that the application is deployed locally at the client's site.
5. The Django application is served by IIS 10.
6. Technologies used: Python, Django, HTML, CSS, JS, Bootstrap, IIS.
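A minimal sketch of the reminder and interest calculation; the data model, the 18% rate, and the simple-interest formula are illustrative assumptions, not the client's exact business logic.

    from dataclasses import dataclass
    from datetime import date

    ANNUAL_INTEREST_RATE = 0.18  # assumed penal interest on overdue amounts

    @dataclass
    class Installment:
        buyer_email: str
        due_date: date
        amount_due: float
        amount_paid: float = 0.0

        @property
        def outstanding(self):
            return max(self.amount_due - self.amount_paid, 0.0)

        def interest(self, today):
            """Simple interest on the outstanding amount for days past the due date."""
            overdue_days = max((today - self.due_date).days, 0)
            return self.outstanding * ANNUAL_INTEREST_RATE * overdue_days / 365

    def reminders(installments, today):
        """Yield (email, outstanding, interest) for every overdue installment."""
        for inst in installments:
            if inst.outstanding > 0 and today > inst.due_date:
                yield inst.buyer_email, inst.outstanding, inst.interest(today)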

APPLICATION DEPLOYMENT ON CLOUD
1. Have been using DigitalOcean Ubuntu servers regularly for Django application deployment.
2. Set up Nginx as the web server, Gunicorn (managed by Supervisor) as the application server, and Postgres as the database on the DigitalOcean droplet (a Gunicorn configuration sketch follows this list).
3. Also used various Amazon Web Services products for various application deployments.
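A minimal sketch of the Gunicorn side of that stack as a gunicorn.conf.py (Gunicorn configuration files are plain Python); the socket path, worker count, and log locations are assumptions.

    # gunicorn.conf.py -- nginx proxies requests to this socket; supervisor keeps
    # the gunicorn process alive. Paths below are illustrative.
    import multiprocessing

    bind = "unix:/run/myproject/gunicorn.sock"     # nginx upstream points at this socket
    workers = multiprocessing.cpu_count() * 2 + 1  # common rule of thumb for worker count
    accesslog = "/var/log/gunicorn/access.log"
    errorlog = "/var/log/gunicorn/error.log"

Supervisor then runs something like "gunicorn -c gunicorn.conf.py myproject.wsgi:application" and restarts it if it exits, while Nginx terminates HTTP and proxies requests to the socket.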

I AM EXPLORING
1. Various algorithms to apply data science to the data available in the finance sector, including some machine learning techniques.
2. Exploring various AWS products, some of which I already know.

 

Skills

Amazon Web Services (AWS), Backend Development, Big Data, Bitcoin, Celery, Cryptocurrency, Data Science, DevOps, Django, Django REST Framework, Frontend Development, Fullstack Development, Git, Gunicorn, JavaScript, Linux, MongoDB, MySQL, Nginx, NumPy, Pandas, PostgreSQL, RabbitMQ, SciPy, Spark, Virtualenv, Visualization, Web Development, Web Scraping, jQuery

Joined: January 2019