March 2025 - Data Engineer
Description
We are looking for a Data Engineer to design, build, and scale a robust data infrastructure. In this role, you will play a key part in shaping the future of our data ecosystem, ensuring it is scalable, efficient, and designed to support high-impact decision-making. You will combine technical expertise with strategic leadership, collaborating across teams to elevate data capabilities and deliver exceptional value.
Contact
careers@codilas.com
Salary range
4,000 – 6,000 EUR (Brutto I)
Your Role and Contributions
- Data Pipeline Architecture & Development. Design and architect complex data pipelines and ETL processes using distributed computing frameworks.
- Data Lakehouse & Storage Optimization. Develop and maintain a data lakehouse, leveraging modern data lake technologies for large-scale data processing.
- Cloud Data Infrastructure. Implement and optimize data ingestion, transformation, and storage strategies across multiple cloud platforms (AWS, Azure, GCP).
- Data Engineering Best Practices. Establish and enforce performance optimization, data quality, and governance standards for engineering workflows.
- Advanced Analytics & Machine Learning Integration. Collaborate with data science and analytics teams to support machine learning models and notebook-based data workflows.
- Scalability & Performance Optimization. Ensure high availability, fault tolerance, and optimized performance of data infrastructure at scale.
- Team Collaboration & Mentorship. Conduct code reviews, mentor junior engineers, and contribute to cross-functional team projects to drive data excellence.
Required Skills & Qualifications
- 6+ years of professional experience in data engineering, including 2+ years in a technical leadership role
- Expertise in big data technologies (e.g., Apache Spark, Kafka, distributed computing frameworks)
- Proficiency in cloud platforms (AWS EMR, Glue, Databricks, data lake/lakehouse solutions)
- Strong skills in Python, SQL, and modern data engineering frameworks (dbt, Airflow)
- Experience with containerization and orchestration (Docker, Kubernetes)
- Proven track record of delivering scalable, resilient data pipelines with strong architectural design capabilities
- Experience with advanced analytics workflows, supporting machine learning models and notebook-based data science
- Ability to collaborate with cross-functional stakeholders, translating technical challenges into business solutions
- Familiarity with CI/CD pipelines and data governance best practices
- Ability to work US time zone hours when required
Nice-to-Have (Not Required)
- AWS cloud platform certifications (AWS Certified Data Analytics, AWS Solutions Architect)
- Databricks certifications (Databricks Certified Data Engineer)
- Strong experience in IAM and security configuration for cloud environments
- Knowledge of CloudFormation or Terraform for infrastructure as code
- Experience with real-time data processing and streaming technologies (Kafka, Event Hub)
- Understanding of subscription-based business models and related data challenges
- Experience in implementing data privacy and security measures at scale
Selection process
- Submit CV. Share your CV and any relevant work samples.
- Interview. A single, relaxed conversation where we get to know each other, discuss our culture, and assess your technical expertise.
- Offer. If we're a good match, we'll extend a formal offer.
What do we offer?
Flexible Working Hours
Manage your own schedule to maintain a healthy work-life balance.
Global Exposure
Opportunities to travel and immerse yourself in diverse cultures, expanding your perspectives.
Hybrid Work Environment
Work remotely, onsite, or mix it up—whatever works best for you.
Top-Tier Equipment
We provide state-of-the-art tools and resources to help you excel.
Continuous Learning
Access to educational resources, training, and professional development opportunities.
Conference Support
We sponsor attendance at industry-leading events, helping you stay on top of emerging trends.
Performance Bonuses
Receive mid-year and year-end bonuses based on productivity and contributions.