Work With Us

Azure Data Engineer

Interview Rounds: 4


As an Azure Data Engineer, your role will be pivotal in shaping our data architecture and processes. Key responsibilities include:

  • Demonstrate proficiency in working with Azure Data Factory, Azure Databricks, and Snowflake.
  • Utilize PySpark and SQL for data engineering and transformation tasks.
  • Analyze data sources and attributes, ensuring a deep understanding of data characteristics.
  • Implement ETL processes to extract, transform, and load data efficiently, using PySpark and other relevant tools.
  • Implement processes to ensure data quality and integrity throughout the data pipeline.
  • Identify and address any anomalies or discrepancies in the data.
  • Collaborate with cross-functional teams to understand data requirements.
  • Document data engineering processes, ensuring transparency and knowledge transfer.
  • Optimize data engineering processes for efficiency and performance.
  • Monitor and enhance the performance of data pipelines.
  • Identify and troubleshoot issues in data engineering processes.
  • Implement effective solutions to address challenges and bottlenecks.
  • Ensure data security and compliance with relevant regulations.
  • Implement best practices for data protection and privacy.
  • Stay updated on the latest trends and technologies in Azure data engineering.
  • Apply continuous learning to enhance skills and contribute to evolving data practices.
  • Collaborate with data scientists, analysts, and other stakeholders to address complex data challenges.
  • Provide technical expertise in solving data-related problems.
  • Execute data engineering tasks within the context of larger projects.
  • Contribute to project planning and execution, ensuring timely delivery.
  • Effectively communicate technical concepts to both technical and non-technical stakeholders.
  • Collaborate with team members through clear and concise communication.

Minimum Education:

  • Bachelor’s degree in Computer Science or equivalent.


Minimum Experience:

  • Experience working with Azure Data Factory, Azure Databricks, PySpark, and Snowflake.
  • Proficiency in PySpark, SQL, and ETL.
  • Ability to analyze data sources and data attributes.
IT Services
United States