Contact
Arek Kulczycki
Senior Python Developer
github.com/arekkulczycki
Profile
Fundamentals – I have coded in Python for 6 years professionally and over 10 years privately, and I'm proficient with the Django framework.
Passion – As a hobby I'm currently developing a chess-playing engine based on Machine Learning (PyTorch, OpenAI, Tensorflow) and highly optimized Python.
Team player – Having worked in an Agile culture for the last 4 years, I find it my favourite style of management. Cooperation and mutual reviews are my core values for developing and delivering quality products.
Honesty and transparency – I like to be straight to the point and base business relations on mutual trust. You can count on my sincere opinion and thoughts in the recruitment process, as well as if we get a chance to work together.
Flexibility – Across many projects I have developed using Kotlin, Java, Go, C#, PHP, and NodeJS. I have used CI/CD tools such as GitHub, Bitbucket, Jenkins, and Ansible, and deployed production solutions to the cloud (AWS, Heroku) or to dedicated servers (Nginx, Apache, Docker). I'll spare you reading more tools and libraries I'm familiar with, having a bunch of them already bolded above, but feel free to ask me in person.
Experience

04.2021 - 02.2024
Backend developer | Everli S.p.a.
Working with data and system integrations. Responsible for constant automation and optimization of processes in a microservice architecture.

03.2018 - 03.2021
Backend & Android developer | Order Group Sp. z o.o.
Most notably basing the work around the Django framework. Across various projects I developed a native Android app, maintained a multiplatform Xamarin app, and developed Unity and Web applications with real-time websocket communication between them.

09.2014 - 03.2017
Full-stack developer | Business-Net Sp. z o.o.
Responsible for maintenance and adding features to a web application, a CRM system, based on .NET WebForms and JQuery.

07.2011 - 08.2011
Internship | AVET Information and Network Security Sp. z o.o.
Designing an interface to simulate responses from a router device, for security breach tests.

Skillset

Languages & Frameworks
• Python: 7 years
• Django: 4 years
• Kotlin: 2 years
• Android: 2 years
• C#: 2 years
• .NET: 2 years
• NodeJS: 2 years
• SQL: 9 years

Databases
PostgreSQL, MySQL, MS SQL, SQLite, Redis, DynamoDB

DevOps
Git, Docker, Bitbucket, Jenkins, Nginx, AWS, Terraform/Terragrunt, RabbitMQ

Practices
Agile, Jira, YouTrack

Other
Unity3D, GStreamer, OpenCV
Diploma
Bachelor's degree in IT and Econometrics at the University of Warsaw
I agree to the processing of personal data provided in this document for realising the recruitment process pursuant to the Personal Data Protection Act of 10 May 2018 (Journal
of Laws 2018, item 1000) and in agreement with Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with
regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation).
Experience
Order Group
During my time at this dynamic software house I had a chance to participate in the development of many
different products that we started from scratch and delivered within several months or quarters.
Among them:
_Mobile airspace navigation
A product used by pilots of helicopters and small airplanes as a navigation aid. In this project
my responsibility was mostly around collecting spatial data updates transmitted by a European
aeronautical organization. The backend created optimized bundles for a mobile app to fetch
and display a map of 3D airspace segmented into zones, each with updated information about
its type, availability, etc., in order to plan a flight path avoiding forbidden areas at the correct
altitudes.
I also briefly worked at the mobile application level in order to optimize parsing data structures.
Stack
• python (django)
• postgresql
• celery
_Renewable energy powerplant control system
My team built a serverless cloud architecture in AWS to create a PoC of a remotely controlled
powerplant. The work included configuring real-time communication with physical devices and
logging the state of the system in a web panel. Data flows spanned several services, most crucially
Lambda, DynamoDB, IoT, and Cognito. One of my responsibilities was to manage the architecture
as IaC using Terraform.
Additionally, at the powerplant security level, I developed a surveillance system that detected
movement as well as high-temperature leaks, using traditional and infrared cameras and object
detection algorithms, in order to raise alarms about dangers in the powerplant.
Stack
• cloud (aws)
• python
• terraform
• docker
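As a rough illustration of the surveillance logic: the real system used OpenCV object detection, but the core idea can be sketched with plain frame differencing for the traditional camera and an intensity threshold for the infrared one. All function names and threshold values below are illustrative assumptions, not the production code.

```python
import numpy as np


def detect_motion(prev_frame: np.ndarray, frame: np.ndarray,
                  diff_threshold: int = 25,
                  min_changed_ratio: float = 0.01) -> bool:
    """Flag motion when enough pixels changed between consecutive grayscale frames."""
    # Widen dtype before subtracting so uint8 arithmetic doesn't wrap around.
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed_ratio = (diff > diff_threshold).mean()
    return bool(changed_ratio > min_changed_ratio)


def detect_heat_leak(ir_frame: np.ndarray, temp_threshold: int = 200) -> bool:
    """Flag a hot spot when any infrared pixel exceeds the intensity threshold."""
    return bool((ir_frame > temp_threshold).any())
```

In practice the per-frame booleans would feed an alerting pipeline; the thresholds here stand in for whatever calibration the cameras required.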
_Mobile barcode tracker
In this project I developed both an Android application and the backend side for it to connect to.
The app was a control tool for warehouses. The objective was to read barcodes, validate them,
and collect them into organized groups of scanned products, then register which goods had been
sold or transported. The data on the backend was then periodically synchronized with an external
system.
Stack
• kotlin (android)
• postgresql
• python (django)
Everli
I’m working with a team responsible for integrating current and new retailers’ data into our
system. This involves parsing and validating daily data input about products from thousands of
physical stores. Several layers of processing, in a microservices ecosystem, produce constantly
up-to-date structured output into our app. The activities around that include defining requirements
for external connections and data structures, automation of internal flows, maintenance,
designing technical solutions, and scoping new ideas for optimizations and new internal features.
Apart from that, I also help to gradually port pieces of our pipeline into AWS. Some of the features
I owned:
Stack
• python
• mysql
• cloud (aws)
• redis
• rabbitmq
• php (laravel)
_Incremental parsing
In order to optimize the flow, I refactored an ingestion piece to identify where the same data
was subsequently being parsed multiple times. To cover those situations I implemented a cache
that kept track of the relation between raw and processed data. In the optimized flow only the
“new” part of the data was processed as usual, and then merged at the end with the corresponding
“known” dataset pulled from the cache.

_Progress tracker
In order to systematize and track the progress of integration with a retailer, I constructed a tool
for the internal users of our system responsible for overseeing the entire process. The difficulty
in this tool was to represent, within an interactive view, the relations between certain steps along
the way. The relations helped navigate the entire process and enabled or disabled some actions
based on the status of others, as well as the overall status.
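The caching idea behind the incremental parsing can be sketched minimally as follows. The `parse_fn` callable and the hashing scheme are hypothetical stand-ins for the actual ingestion step; the real system kept the raw-to-processed relation in a persistent store rather than an in-memory dict.

```python
import hashlib


def _key(raw: str) -> str:
    """Stable fingerprint of a raw record, used to recognize already-seen data."""
    return hashlib.sha256(raw.encode("utf-8")).hexdigest()


class IncrementalParser:
    """Parse only unseen raw records; reuse cached results for known ones."""

    def __init__(self, parse_fn):
        self.parse_fn = parse_fn  # expensive per-record parser (hypothetical)
        self.cache = {}           # raw-record fingerprint -> processed result

    def process(self, raw_records):
        out = []
        for raw in raw_records:
            key = _key(raw)
            if key not in self.cache:         # "new" data: parse as usual
                self.cache[key] = self.parse_fn(raw)
            out.append(self.cache[key])       # merge with "known" cached results
        return out
```

On repeated ingestion runs only the records not yet fingerprinted hit the parser, which is the optimization described above.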