Job Details
Overview

The primary focus of this role is to lead data architecture activities for critical projects. The role is responsible for architecting, designing, and implementing Advanced Analytics capabilities within Azure Data Lake, Databricks, and related ETL technologies, satisfying project requirements while adhering to enterprise architecture standards. The role will translate business requirements into technical specifications, including data streams, integrations, transformations, databases, and data warehouses, and will implement cutting-edge solutions by building semantic and virtual data layers that provide faster, federated query execution.
This role will also be actively involved in the delivery of the solutions.

Responsibilities

- Lead data architecture for critical data and analytics projects.
- Drive and deliver data architecture deliverables such as conceptual, logical, and physical architectures.
- Partner with Enterprise Architecture (Data & Analytics) to ensure the use of standard patterns.
- Partner with project leads, IT leads, Security, and Enterprise Architecture team members in architecting end-to-end solutions.
- Gain architecture alignment and sign-off, and guide the project team during implementation.

Qualifications

- Bachelor's degree in Computer Science, MIS, Business Management, or a related field.
- 7+ years of experience in Information Technology.
- 7+ years of experience with data warehouses, data lakes, and related technologies.
- 5 years of experience creating data models for complex analytical applications.
- 5 years of experience designing data architectures for flexibility and performance.

Mandatory Tech Skills

- 5+ years of experience with Teradata and the Hadoop ecosystem (e.g., Hive, Spark, Kafka, HBase).
- 3 to 5 years of hands-on experience architecting, designing, and implementing data ingestion pipelines for batch, real-time, and streaming workloads on the Azure cloud platform at scale.
- 1 to 3 years of experience using ETL tools such as Infoworks, NiFi, or similar, especially for large volumes of data.
- 1 to 3 years of hands-on experience with Databricks.
- 3 to 5 years of working experience with Azure cloud technologies such as Spark, IoT, Synapse, Cosmos DB, Log Analytics, ADF, ADLS, Blob Storage, etc.
- 1 to 3 years of experience evaluating emerging technologies.
- 1 to 3 years of experience with Python/PySpark/Scala for building data processing applications.
- Experience extracting, querying, and joining large data sets at scale.
- Experience implementing Azure security (authentication, authorization, network security, endpoints, etc.).

Non-Mandatory Tech Skills

- Highly analytical, motivated, decisive thought leader with strong critical thinking and the ability to quickly connect technical and business "dots."
- Strong communication and organizational skills; comfortable with ambiguity while juggling multiple priorities and projects at the same time.
- Drives non-functional requirements such as application scalability, availability, capacity planning, disaster recovery strategy, and performance measurement.