Kafka (nice to have)
Hadoop (nice to have)
ETL tools (regular)
Data Integration (regular)
We are Addepto, where you can feel a startup atmosphere! We believe that the only constant in life is change, so we keep developing and improving to become better at what we do every day. We think outside the box to create and deliver the best solutions in the areas of Big Data, AI, and Business Intelligence.
Our team, based in Warsaw and working remotely, is looking for a Senior Data Engineer to focus mainly on designing and constructing data processing architecture.
We are open to candidates with different levels of expertise who want to further develop their skills and experience in this role.
Some of our recent Big Data projects:
Data lakes that store terabytes of data and process machine learning tasks for a large telecom company
Streaming applications that serve real-time data analytics for manufacturing companies
Systems that support decision-making and help analyze data in a unified format for controlling and operations departments
Systems supporting real-time machine learning predictions on massive datasets, preventing losses for pharmaceutical companies
What we offer:
Work in a well-coordinated team of Big Data & AI enthusiasts
Fast career path and the possibility of further development in many areas
Challenging projects with the latest technology stack for global clients
Flexible working hours and the possibility of remote work
Your choice of the form of employment
Other benefits (e.g. team-building events, training budget, and a medical & sports package)
Your responsibilities:
Designing and constructing scalable data processing architecture
Designing, building, and deploying effective data ingestion pipelines/streams in StreamSets Data Collector or Kafka
Building applications that aggregate, process, and analyze data from various sources
Cooperating with the Data Science department on Machine Learning projects (including text/image analysis and building predictive models)
Using Big Data and BI technologies (e.g. Spark, Kafka, Hadoop, SQL)
Managing distributed database systems such as ClickHouse, BigQuery, Teradata, Oracle Exadata, and PostgreSQL + Citus
Data modeling, including Star and Snowflake schemas
Developing and organizing data transformations in dbt and Apache Airflow
Gathering business requirements and translating them into technical code
Ensuring the best possible performance and quality of the delivered packages
Managing business users' expectations
Our requirements:
Higher education in a technical or mathematical field (or being in the final year of studies)
Commercial experience in the implementation, development, or maintenance of Big Data or Business Intelligence systems
Knowledge of Java, Scala, or Python
Experience with SQL
Good command of English (min. B2)
Independence and responsibility for delivering a solution
Excellent knowledge of dimensional data modeling
Experience building and maintaining relationships with senior leaders and external stakeholders
Good communication and soft skills
Ability to lead discussions and requirement sessions, and to comprehend, summarize, and finalize requirements
Knowledge of the aviation industry and aviation solutions is preferable
Nice to have: experience with Spark, NiFi, Docker, AWS or Azure, and Splunk
Are you interested in Addepto and would you like to join us?
Get in touch! We are looking forward to receiving your application. Would you like to know more about us?
Visit our website (career page) and social media (Facebook,