Data Masters

Stefan

Available now

Data Engineer @ Data Masters

Seniority: Mid-level
Location & Timezone: GMT+01:00, Skopje, Macedonia
Average Hourly Rate: $50 - $60/hr
Languages: Macedonian, English

Top skills

Microsoft Fabric
PySpark
Microsoft Azure
SQL
Databricks

About

Based in North Macedonia, this Data Engineer brings 5+ years of expertise in data engineering and BI. He specializes in building scalable pipelines and Lakehouse models using Apache NiFi, Airflow, Spark, and Azure/Microsoft Fabric. With a focus on internal analytics products and self-service cloud platforms, he excels at integrating heterogeneous data. Dedicated to ELT/ETL best practices, he ensures data reliability for global analytics.

Top skills

Verified by Pangea.ai due diligence

Top Skills | Current Usage | Seniority
Microsoft Fabric | 10% | 2 years
PySpark | 10% | 2 years
Microsoft Azure | 20% | 1 year
SQL | 50% | 7 years
Databricks | 10% | 1 year

All skills

Roles and tools used to bring ideas to life and create meaningful experiences.

Databases
Architecture
API
Data Analytics and Visualization

Professional experience

Explore a curated selection of projects highlighting Stefan's expertise and experience. Each project aims to showcase challenges, solutions, and the final outcome, along with the tools and technologies used.

Under MNDA
Internal Product (Fabric) — Data Engineer
Implementation of ingestion and orchestration workflows as part of a large-scale data platform modernization initiative, moving from legacy systems toward a scalable, cloud-ready architecture. Built robust data pipelines using Apache NiFi to ingest data from heterogeneous source systems into a central data lake and warehouse. Developed and orchestrated Spark / PySpark transformations via Apache Airflow, enabling modular, reusable ELT processes that can be lifted to cloud platforms such as Azure and Fabric. Designed and implemented Data Vault-based models to ensure flexibility, auditability, and long-term maintainability of the platform. An illustrative orchestration sketch follows below.
Data Engineering
Big Data & Analytics
Data Visualization
Database Design & Management
Apache NiFi
+3
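
The MNDA covers the actual pipeline code, but the Airflow-orchestrated PySpark ELT pattern described above can be sketched roughly as follows. This is a minimal, hypothetical example: the DAG id, schedule, script path, and connection id are illustrative, not the project's real configuration.

```python
# Minimal sketch of an Airflow DAG submitting a PySpark ELT job.
# All names, paths, and settings here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

with DAG(
    dag_id="elt_ingest_and_transform",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    transform = SparkSubmitOperator(
        task_id="pyspark_transform",
        application="/opt/jobs/transform.py",   # hypothetical PySpark script
        conn_id="spark_default",
        conf={"spark.sql.shuffle.partitions": "200"},
    )
```

Keeping each transformation in a self-contained PySpark script is what makes this kind of DAG portable to Azure or Fabric later, as the project description notes.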

Under MNDA
Telecom Analytics Project — Business Intelligence Developer
Designed and developed reporting and analytics solutions for telecom business stakeholders, supporting marketing, sales, and operational use cases. Implemented ETL processes and data transformations using SQL / PL/SQL on top of Oracle-based systems, with an eye toward future migration to cloud/Fabric-based platforms. Created dashboards and KPI reports (via SQL and BI tools) to support decision-making in commercial and operational domains. Worked closely with business users to define metrics, validate calculations, and improve data quality across reports. An illustrative KPI query sketch follows below.
Data Analytics and Visualization
Big Data & Analytics
Database Design & Management
Data Engineering
Data Visualization
+4
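
The telecom reporting itself was implemented in Oracle SQL / PL/SQL under MNDA. Purely to illustrate the shape of the KPI logic described above, here is a hedged sketch expressed as Spark SQL via PySpark; the sales_orders table and its columns are invented for the example.

```python
# Illustration only: the real KPI reports used Oracle SQL / PL/SQL and BI tools.
# The sales_orders table and its columns are hypothetical and assumed to be
# registered in the Spark metastore.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kpi_report_sketch").getOrCreate()

monthly_kpis = spark.sql("""
    SELECT region,
           date_trunc('month', order_date)  AS month,
           SUM(amount)                      AS revenue,
           COUNT(DISTINCT customer_id)      AS active_customers
    FROM sales_orders
    GROUP BY region, date_trunc('month', order_date)
    ORDER BY region, month
""")
monthly_kpis.show()
```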

Under MNDA
Enterprise Data Platform Modernization — Data Engineer
Implementation of ingestion and orchestration workflows as part of a large-scale data platform modernization initiative, moving from legacy systems toward a scalable, cloud-ready architecture. Built robust data pipelines using Apache NiFi to ingest data from heterogeneous source systems into a central data lake and warehouse. Developed and orchestrated Spark / PySpark transformations via Apache Airflow, enabling modular, reusable ELT processes that can be lifted to cloud platforms such as Azure and Fabric. Designed and implemented Data Vault-based models to ensure flexibility, auditability, and long-term maintainability of the platform. An illustrative Data Vault loading sketch follows below.
Cloud Services
Digital Transformation
Infrastructure & DevOps
Architecture
Cloud Computing
+6
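
The Data Vault model itself is under MNDA, but the generic hub-loading pattern it follows can be sketched as below. Table names, the business key, and the hashing choice are hypothetical and only stand in for the real design.

```python
# Hypothetical Data Vault hub load: hash new business keys and append only
# those not already present in the hub. All table and column names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dv_hub_load_sketch").getOrCreate()

staged = spark.table("staging.customers")      # assumed staging table
hub = spark.table("vault.hub_customer")        # assumed existing hub table

new_keys = (
    staged
    .select(F.col("customer_number").alias("customer_bk"))   # hypothetical business key
    .dropDuplicates(["customer_bk"])
    .withColumn("hub_customer_hk", F.sha2(F.col("customer_bk").cast("string"), 256))
    .withColumn("load_dts", F.current_timestamp())
    .withColumn("record_source", F.lit("staging.customers"))
    .join(hub, on="hub_customer_hk", how="left_anti")         # keep keys missing from the hub
)

new_keys.write.mode("append").saveAsTable("vault.hub_customer")
```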

Preferred tools

View the preferred tools and apps used by Stefan to assess compatibility and alignment.

Jira
Git
Gitlab
Github
Azure
Microsoft
Oracle PL/SQL

Career highlights

Discover Stefan’s professional journey, including employment history, certifications, and educational background.

Data Engineer
Data Masters, 2020 - Present
Develop and maintain scalable batch data pipelines using Apache NiFi and other orchestration tools for automated ingestion. Orchestrate ETL and ELT workflows with Apache Airflow, ensuring robustness and reliability for daily operations. Build Spark and PySpark applications to transform and aggregate large data volumes for analytics. Design data warehouse and Lakehouse models using dimensional modeling and Data Vault, in both on-prem and cloud-ready contexts. Collaborate with analysts, engineers, and business teams to translate requirements into technical solutions that are ready for cloud deployment. Implement data quality checks, validation steps, and alerting mechanisms within pipelines to maintain trust in data.
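
As a rough illustration of the data quality checks and alerting mentioned above, a batch pipeline step might validate a key column and fail loudly so the orchestrator can raise an alert. The table, column, and threshold below are hypothetical.

```python
# Hypothetical data quality gate inside a batch pipeline: count null business keys
# and raise if the ratio exceeds a threshold. In Airflow, the raised exception fails
# the task, which can be wired to email or chat alerts.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_check_sketch").getOrCreate()

df = spark.table("analytics.orders")                         # assumed table name

total = df.count()
null_keys = df.filter(F.col("order_id").isNull()).count()    # hypothetical key column
null_ratio = null_keys / total if total else 1.0

if null_ratio > 0.01:  # illustrative 1% threshold
    raise ValueError(f"DQ check failed: {null_ratio:.2%} of order_id values are null")
```
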
Data Engineer
Data Masters, 2020 - Present
Developing an internal analytics product on Microsoft Fabric, designing Lakehouse-based data models, orchestrating pipelines, and enabling self-service analytics in the cloud. Experience with both batch and real-time processing, integrating data from heterogeneous enterprise systems into cloud-ready data platforms. Focused on meeting and exceeding client expectations by applying best practices in data modeling, ELT/ETL, and cloud architectures, ensuring data usability and reliability for analytics and reporting.
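
The Fabric workloads themselves are not public, but the mix of batch and real-time ingestion described above can be sketched in PySpark roughly as follows. Paths, table names, and the Delta-enabled Lakehouse environment are assumptions for the example.

```python
# Hypothetical sketch: a one-off batch backfill and a continuous stream landing in
# the same Delta table. Paths and table names are invented; a Delta-enabled
# environment (e.g. a Fabric or Databricks Lakehouse) is assumed.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse_ingest_sketch").getOrCreate()

# Batch: backfill historical files into the target table.
batch_df = spark.read.parquet("Files/landing/events/backfill/")       # hypothetical path
batch_df.write.format("delta").mode("append").saveAsTable("lakehouse.events")

# Streaming: keep appending new records in micro-batches.
stream_df = spark.readStream.format("delta").load("Files/landing/events/stream/")
(stream_df.writeStream
    .format("delta")
    .option("checkpointLocation", "Files/checkpoints/events/")        # hypothetical
    .toTable("lakehouse.events"))
```
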
Data Engineer
Data Masters, 2023 - 2025
Designed and implemented data ingestion and transformation pipelines on Microsoft Fabric, leveraging Lakehouse architecture and Pipelines. Integrated data from multiple enterprise systems into OneLake / Fabric Lakehouse to create a unified analytical layer for internal products. Optimized query performance, storage layout (files/partitions), and pipeline scheduling with a strong focus on cloud cost efficiency and scalability. Collaborated with product owners and analysts to design business-friendly semantic models that can be served directly to Power BI and self-service users. Applied cloud-native best practices around security, monitoring, and reliability in Azure/Fabric environments.
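
One concrete form of the storage-layout tuning described above is writing curated tables partitioned by a date column so downstream queries can prune files. The sketch below is hypothetical: the source table, timestamp column, and target name are invented.

```python
# Hypothetical layout optimization: derive a date column, reduce small files per
# partition, and write a Delta table partitioned by that date. Names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("layout_tuning_sketch").getOrCreate()

events = spark.table("lakehouse.raw_events")                  # assumed source table

(events
    .withColumn("event_date", F.to_date("event_ts"))          # hypothetical timestamp column
    .repartition("event_date")                                 # fewer, larger files per partition
    .write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .saveAsTable("lakehouse.events_curated"))
```
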
Business Intelligence Developer
Data Masters, 2021 - Present
Designed and developed reporting and analytics solutions for telecom business stakeholders, supporting marketing, sales, and operational use cases. Implemented ETL processes and data transformations using SQL / PL/SQL on top of Oracle-based systems, with an eye toward future migration to cloud/Fabric-based platforms. Created dashboards and KPI reports (via SQL and BI tools) to support decision-making in commercial and operational domains. Worked closely with business users to define metrics, validate calculations, and improve data quality across reports.

Testimonials

Maya

Deutsche Telekom

Verified Testimonial

Stefan was instrumental in transforming our telecom data into actionable insights. He didn't just manage our legacy Oracle systems; he bridged the gap between complex SQL and our actual business needs for sales and marketing. By validating our metrics and improving data quality, he gave us the certainty we needed to trust our KPIs. He is a senior expert who truly understands how to empower people through reliable data.
