Job Details
This role is responsible for creating data orchestration with Azure Data Factory pipelines and dataflows. The key responsibility is to understand business requirements and implement them using Azure Data Factory.

Roles & Responsibilities:
- Understand business requirements and actively provide input from a data perspective
- Understand the underlying data and the flow of data
- Build simple to complex pipelines and dataflows
- Implement modules that have security and authorization frameworks
- Recognize and adapt to changes in processes as the project evolves
- Expert-level knowledge of Azure Data Factory
- Advanced knowledge of Azure SQL DB, Synapse Analytics, Power BI, T-SQL, Logic Apps, and Function Apps
- Ability to analyze and understand complex data
- Monitor day-to-day Data Factory pipeline activity
- Knowledge of Azure Data Lake is required; experience with Azure services such as Analysis Services, SQL Databases, Azure DevOps, and CI/CD is a must
- Knowledge of master data management, data warehousing, and business intelligence architecture
- Experience in data modeling and database design, with excellent knowledge of SQL Server best practices
- Excellent interpersonal and communication skills (both oral and written), with the ability to communicate at various levels with clarity and precision
- Clear understanding of the DW lifecycle; contribute to preparing design documents, unit test plans, and code review reports
- Experience working in an Agile environment (Scrum, Lean, Kanban) is a plus
- Knowledge of big data technologies: Spark framework, NoSQL, Azure Databricks, Python

Qualifications & Experience:
- Bachelor's or Master's degree in computer science or a related field
- 6-9 years of data engineering or software development experience