Work format: full-time, 100% remote
Start: ASAP
Hi!
We are looking for Azure Databricks Engineers for our US-based client. The work covers areas such as migration, data collection, and optimization of Databricks-based solutions. The client has a continuous demand for specialists: the projects they run are mostly short-term (with a high probability of extension), and thanks to this constant demand, the client can usually offer a new project right after the previous one ends.
Currently, specialists are needed for three projects:
- two projects focused on migrations to Databricks (a marketing platform and a fundraising platform)
- one project focused on building a medical data analytics platform and embedding it into the Databricks environment
For the client, it is crucial to have strong experience with Azure and/or AWS, as well as solid knowledge of Databricks and Apache Spark. The client mainly works with US-based companies; in most cases, only a small time-zone overlap is required (e.g., 10:00-18:00 CET), though we are open to candidates who prefer different working hours.
General responsibilities:
- Planning tasks and selecting appropriate tools
- Integrating databases in near real time
- Designing and developing ETL processes
- Conducting migrations of databases, platforms, and ML models
- Optimizing and automating platforms
- Close cooperation with data engineers, data scientists, and architects
Requirements:
- Solid experience as a data engineer or in a related role (8+ years)
- Strong knowledge (min. 2-3 years of experience) of the Databricks platform and Apache Spark (migrations, ETL processes, integrations)
- Strong Python skills
- Experience with data migrations
- Experience with Microsoft Azure (e.g., Data Factory, Synapse, Logic Apps, Data Lake) and/or AWS (e.g., Redshift, Athena, Glue)
- Strong interpersonal and teamwork skills
- Initiative and the ability to work independently
- English at a level enabling fluent team communication
Nice to have:
- Experience in designing and optimizing data workflows using DBT, SSIS, TimeXtender, or similar (ETL/ELT)
- Experience with any big data or NoSQL platforms (Redshift, Hadoop, EMR, Google Data, etc.)
How we work and what we offer:
- We value open communication throughout the recruitment process and after hiring: clarity about the process and employment terms is important to us
- We keep recruitment simple and human: our processes are as straightforward and candidate-friendly as possible
- We follow a "remote first" approach, so remote work is our standard and business travel is kept to a minimum
- We offer private medical care (Medicover) and a Multisport card for contractors