
Data Engineer
- Sofia
- Permanent
- Full-time
In this role, you will design, build, and maintain robust, secure, and scalable data pipelines and models that power analytics, reporting, AI, and operational data flows across the enterprise.
At 74Software, we are modernizing our data landscape: integrating cloud data platforms, real-time data processing, advanced ELT tools, and ML/AI. This is an opportunity to work with both established enterprise systems and modern cloud-native architectures, contributing directly to high-impact projects across our global business.
Responsibilities:
- Design and implement data models (e.g., dimensional, Data Vault) to support analytical, AI, and operational needs.
- Develop and optimize ELT/ETL data pipelines using SQL, Python, and tools such as dbt.
- Build, schedule, and monitor data workflows using orchestration tools such as Airflow (see the sketch after this list).
- Support and enhance data governance, data security, and data quality initiatives.
- Contribute to the evolution of our data lake and data warehouse (e.g., Snowflake).
- Collaborate with analysts and business stakeholders to deliver clean, trusted data for BI tools like Power BI.
- Explore and participate in projects around real-time data processing, advanced analytics, and ML/MLOps.
- Document data models, pipelines, and architecture for operational clarity and maintainability.
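Purely as an illustration of the pipeline and orchestration work described above, here is a minimal sketch of an Airflow DAG that loads raw data and then runs a dbt transformation. The DAG id, schedule, script path, and dbt project directory are hypothetical placeholders, not a description of 74Software's actual stack.

```python
# Illustrative sketch only: a daily ELT workflow orchestrated with Airflow.
# All names and paths below are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_sales_elt",            # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Extract and load raw data into the warehouse (e.g., Snowflake).
    extract_load = BashOperator(
        task_id="extract_load",
        bash_command="python /opt/pipelines/load_sales.py",  # hypothetical script
    )

    # Transform raw tables into analytics-ready models with dbt.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/sales",  # hypothetical project
    )

    extract_load >> dbt_run  # run the load step before the dbt transformation
```

In practice, workflows like this would feed the cloud data warehouse and the models consumed by BI tools such as Power BI.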
Must-have:
- 3 years of experience in a similar role.
- Solid experience in database design, management, and optimization.
- Strong data modeling expertise.
- Advanced SQL skills for data extraction, transformation, and analysis.
- Proficiency in Python for scripting and data pipeline development.
Should-have:
- Experience with a cloud data platform (e.g., Snowflake, BigQuery, Redshift).
- Understanding of data lake and data warehouse architecture.
- Familiarity with data governance, security, and data quality frameworks.
- Experience with tools such as dbt for data transformation and Airflow for orchestration.
- Knowledge of real-time data processing patterns and tools.
- Experience building and managing ELT/ETL pipelines.
- Knowledge of Data Vault methodology.
Could-have:
- Experience with Power BI or similar analytics and visualization platforms.
- Exposure to Microsoft Fabric and Apache Spark.
- Familiarity with AWS and/or Google Cloud Platform (GCP) ecosystems.
- Interest or experience in Machine Learning and MLOps workflows.
Why join us:
- Join a global team shaping the future of enterprise integration and data-driven decision-making.
- Work across modern and legacy systems, combining classic data engineering with cloud-native and real-time architectures.
- Be part of a supportive, collaborative culture where your growth is a shared success.
- Contribute to high-visibility projects that directly impact business outcomes for over 11,000 customers worldwide.
- Expand your expertise in data modeling, governance, cloud platforms, and advanced analytics.