Python Experience
I have 6+ years of experience building Python-based software and 4+ years working
with frameworks such as Django and Django REST Framework.
As a Python enthusiast, I am well versed in multitasking, data scraping, analysis &
visualization, and real-time processing.
I have solid experience building RESTful APIs with Django REST Framework, and I am
also familiar with the boto3, NumPy, and pandas packages.
For ORMs, I usually use the Django ORM or SQLAlchemy.
Below are my past Python-related projects.
1. IoT Toolkit & Dashboard (2-year contract)
Summary
This toolkit delivers real-time in-store data that empowers users to make data-driven
decisions.
Role
I participated in this project as a full-stack developer, working with a mechanical
engineer and two QA engineers.
I architected the whole system and wrote the back end, the front end, and a simple
deployment script.
For the back end, I wrote the RESTful API layer for user authentication & authorization
and data visualization, plus a service layer for interacting with DynamoDB, partitioned
by sensor. In the service layer, I calculated the data for reports.
For the front end, I built a Single Page Application with Angular 8 and D3.js.
Stacks
Python, Django REST Framework, Amazon DynamoDB, AWS Lambda, NumPy, Pandas,
Angular, D3.js
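The report-calculation step in the service layer described above could be sketched roughly as follows. This is a minimal, illustrative sketch: the table name, item fields, and aggregation are hypothetical, and the DynamoDB query is shown only in the comment so the snippet runs standalone.

```python
"""Sketch of the report service layer (illustrative; names are hypothetical).

In production the readings would come from DynamoDB via boto3, e.g.:

    from boto3.dynamodb.conditions import Key
    table = boto3.resource("dynamodb").Table("sensor-readings")
    items = table.query(KeyConditionExpression=Key("sensor_id").eq(sid))["Items"]

Here the aggregation is shown on plain dicts so it runs without AWS access.
"""
import pandas as pd

def hourly_report(readings):
    """Aggregate raw sensor readings into per-sensor hourly averages."""
    df = pd.DataFrame(readings)
    df["hour"] = pd.to_datetime(df["ts"]).dt.hour
    return (
        df.groupby(["sensor_id", "hour"])["value"]
        .mean()
        .reset_index(name="avg_value")
    )

readings = [
    {"sensor_id": "door-1", "ts": "2020-05-01T09:05:00", "value": 4},
    {"sensor_id": "door-1", "ts": "2020-05-01T09:40:00", "value": 6},
    {"sensor_id": "door-1", "ts": "2020-05-01T10:10:00", "value": 3},
]
report = hourly_report(readings)
```

Keeping the query and the calculation in separate functions is what makes the service layer easy to test without a live DynamoDB table.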
2. Real-Time Packet Analysis & Report System (6-month contract)
Summary
This was an internal project developed to provide base statistics for domain
buyers & sellers.
Role
I participated in this project as the lead developer, leading one other engineer. The most
challenging part was processing about 50 GB of pcap file data every day.
First, I wrote an AWS Lambda function triggered whenever a new PCAP file was uploaded
to S3. (A PCAP file is a dump of packets captured on a network device over a certain
period of time; the average file size was about 100 MB.)
I wrote the Lambda in Python. The main issue with this solution was the processing
time per file: it took over 15 minutes, which is exactly the run-time limit of AWS
Lambda, so we couldn't process 50 GB of data in a day. I decided to adopt a different
approach. Since the client had asked me to architect the system as an AWS serverless
application, I reported this limitation to him. He understood and agreed with my
message-queue idea.
I designed a message-queue processing system with AWS Lambda, RabbitMQ, S3, and
AWS Athena. When a new file is added to S3, a Lambda enqueues a message with the
file name, and the consumers dequeue it and analyze the uploaded S3 file.
The consumers run as threads on Amazon EC2 instances, so they have no run-time
limit. I launched 16 concurrent consumers at the same time, which let us process all
the data in real time without any delay.
Stacks
Python, AWS Lambda, S3, Athena, RabbitMQ, SQLite3, TShark
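The producer/consumer flow above could be sketched as follows. This is an illustrative sketch, not the production code: the event shape follows the S3 notification format, but `queue.Queue` stands in for RabbitMQ and the pcap analysis is stubbed so the flow runs standalone.

```python
"""Sketch of the queue-based pipeline (illustrative; names are hypothetical).

In production the producer is an S3-triggered Lambda publishing to RabbitMQ
(e.g. via pika's channel.basic_publish), and each consumer is a thread on an
EC2 instance that downloads the pcap from S3 and analyzes it with TShark.
"""
import queue
import threading

messages = queue.Queue()   # stand-in for the RabbitMQ queue
processed = []             # stand-in for the analysis results

def lambda_handler(event, context=None):
    """S3 ObjectCreated handler: enqueue one message per uploaded file."""
    for record in event["Records"]:
        key = record["s3"]["object"]["key"]
        messages.put(key)  # production: publish the key to RabbitMQ

def consumer():
    """Dequeue file names and analyze them (analysis stubbed out here)."""
    while True:
        key = messages.get()
        if key is None:          # sentinel: shut this worker down
            break
        processed.append(key)    # production: fetch from S3, run TShark

# simulate two S3 upload events and 2 of the 16 concurrent consumers
lambda_handler({"Records": [{"s3": {"object": {"key": "dump-001.pcap"}}},
                            {"s3": {"object": {"key": "dump-002.pcap"}}}]})
workers = [threading.Thread(target=consumer) for _ in range(2)]
for w in workers:
    w.start()
for _ in workers:
    messages.put(None)
for w in workers:
    w.join()
```

The key design point is that the Lambda only does the cheap enqueue step, so the 15-minute limit never applies; the long-running analysis happens on EC2, where worker count can be tuned to match the daily volume.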
3. Telegram Bot (1-year contract)
Summary
A Telegram bot that guards groups against scammers. It prevents scammers from
joining, supports the owner and admins in banning or unbanning group members, and
tracks users' messages and status.
Role
I participated in this project as the lead / back-end developer, leading two other
developers and a QA engineer. I wrote the real-time Telegram message-capture
subsystem and the command-dispatch subsystem.
Stacks
Python, MySQL, Celery, RabbitMQ
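The command-dispatch subsystem could be sketched as a small registry that routes `/commands` to handlers. This is a minimal illustrative sketch: the handler names and return values are hypothetical, and the actual Telegram Bot API calls (e.g. banning a chat member) are only noted in comments.

```python
"""Minimal sketch of a command-dispatch subsystem (illustrative)."""

HANDLERS = {}

def command(name):
    """Decorator: register a handler for a /command."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@command("ban")
def ban(issuer, args):
    # production: verify issuer is an admin, then call the Bot API to ban
    return f"{issuer} banned {args[0]}"

@command("unban")
def unban(issuer, args):
    return f"{issuer} unbanned {args[0]}"

def dispatch(issuer, text):
    """Parse '/cmd arg ...' messages and route to the registered handler."""
    if not text.startswith("/"):
        return None  # plain message: tracked elsewhere, not dispatched
    cmd, *args = text[1:].split()
    handler = HANDLERS.get(cmd)
    return handler(issuer, args) if handler else f"unknown command: {cmd}"
```

Registering handlers through a decorator keeps adding a new admin command to a single function definition, with no changes to the dispatch loop itself.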
4. ECommerce Bot & Management System (2-year contract)
Summary
This platform provides an intuitive graphical interface for defining conversations
(BPMN) and stores its data in the cloud in Switzerland. It is offered as SaaS or can be
installed on the user's own infrastructure.
Role
I participated in this project as a back-end developer with three other developers and
two QA engineers. I wrote the Telegram & Facebook bots and the bot-management part,
and built a RESTful API using Flask to interact with registered bots.
Stacks
Python, Django, Flask, PostgreSQL, Celery, RabbitMQ, Vue.js
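A Flask API for managing registered bots, as mentioned above, could look roughly like this. The routes, field names, and in-memory registry are hypothetical stand-ins (the project stored bots in PostgreSQL), so treat this as a sketch of the shape of the API rather than its actual schema.

```python
"""Sketch of a bot-management REST API (illustrative; routes are hypothetical)."""
from flask import Flask, jsonify, request

app = Flask(__name__)
BOTS = {}  # production: rows in PostgreSQL, not an in-memory dict

@app.route("/bots", methods=["POST"])
def register_bot():
    """Register a new bot for a messaging platform."""
    data = request.get_json()
    BOTS[data["name"]] = {"platform": data["platform"], "active": True}
    return jsonify({"name": data["name"]}), 201

@app.route("/bots/<name>", methods=["GET"])
def get_bot(name):
    """Return one registered bot, or 404 if it is unknown."""
    bot = BOTS.get(name)
    if bot is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(bot)

# exercise the API with Flask's built-in test client
client = app.test_client()
client.post("/bots", json={"name": "shopbot", "platform": "telegram"})
resp = client.get("/bots/shopbot")
```

Flask's `test_client` makes it possible to exercise the whole request/response cycle without running a server, which is also how such endpoints are typically unit-tested.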