Azure Data Engineer

Job Category: Azure Data Engineer
Job Location: Noida, Bangalore, Indore

Mandatory Skills:
• Excellent knowledge of and hands-on experience with Azure Data Services (e.g., ADF), along with Databricks.
• Good knowledge of and expertise in SQL, Python, and PySpark.
• Good understanding of ETL & data warehousing concepts.
• Excellent written & spoken communication skills, with the ability to interact with customers and relevant stakeholders.
• Ability to understand the provided requirements and architecture and translate them into an implementation that follows best practices.
• Self-driven, with an excellent learning attitude.
• Knowledge of and exposure to Databricks.
• Experience with the Snowflake data warehouse.
Good to have:
• Experience with and conceptual knowledge of big data technologies: Hadoop, HDFS, Hive, Spark.
• Experience with ETL tools (e.g., Matillion, dbt).
• Exposure to other cloud platforms.
Roles & Responsibilities
We are looking for Azure Data Engineers with 3 to 8 years of experience.
• The Data Engineer is part of the Information Technology team and is responsible for supporting data and analytics solutions across projects.
• This individual will collaborate across projects and create data solutions for a variety of business use cases.
• Drive innovation within Data Engineering by playing a key role in shaping the technology for the future of our data and analytics.
• Serve as a critical team member in the design and development of highly complex data projects.
• Identify gaps and weaknesses in our data stack and continue to guide learning and advancement for the team.
• Provide technical expertise to teams in all phases of work, including analysis, design, development, and implementation of cutting-edge solutions.
• Negotiate and influence changes outside of the team that continuously shape and improve the data platform.
• Understand the UI needs, backend design, and data design, and create the corresponding backend code.
• Run diagnostic tests, repair defects, and provide technical support.
• Document Node.js processes, including database schemas, and prepare reports.
• Design and implement APIs and integrate with third-party APIs as needed.
• Develop and maintain databases using technologies such as MongoDB or MySQL.
• Create reusable code and libraries for future use.
• Optimize applications for maximum speed and scalability.
• Troubleshoot and debug issues, and provide timely resolutions.

Primary Skills:
Azure Data Services
SQL, Python, PySpark
ETL & Data warehousing
Databricks
Snowflake data warehouse

No. of Positions: 2
Experience Range: 3-8 years
Budget: 18 USD
