Platform Engineer
$9,000 - $12,000 / month
- The ideal candidate(s) should have 6-10 years of hands-on data engineering expertise and experience building at least one Python-, Spark-, or Scala-based data ingestion, transformation, and egress framework
- Working experience with data lake implementations on Azure cloud / big data technologies
- Cloud experience in one or more of Azure, AWS, or GCP environments
- Expertise with modern data warehouse design patterns and tools
- Good understanding of data architecture principles, including data modelling
- Should be able to tune Spark and Scala jobs for performance (a tuning sketch follows this list)
- Deep understanding of distributed computing systems
- Should be familiar with data warehousing concepts and physical data modelling techniques
- Able to stitch together real-time streaming solutions using Kappa and Lambda architectures (a streaming sketch follows this list)
- Proficient with big data tools/technologies such as Hive, Hadoop, YARN, Kafka, and Spark Streaming
- Experience with Azure cloud data technologies (ADLS Gen2, Cosmos DB, AKS, AEH, ADF, Synapse)
- Familiar with DevSecOps tools and methodologies: CI/CD (e.g., Jenkins, Bitbucket, GitHub), Azure DevOps, and best practices for code deployment
- The ideal candidate should have hands-on experience working with cloud service models, including SaaS, PaaS, and IaaS
- Proficiency in Infrastructure as Code (IaC) using Terraform is essential
- Experience with Chef configuration management is highly desirable
- Implement and enforce cloud security best practices, ensuring data privacy, compliance, and efficient governance through robust identity management, encryption, and monitoring frameworks
- Working knowledge of job orchestration tools (Control-M, Airflow); see the orchestration sketch after this list
- Work with Architects, Technical Leads, and Business Teams and contribute to the development of technical designs
- Strong knowledge of various data structures and the ability to extract data from various data sources
- Provide technical database consultation on application development, global infrastructure, and other database administration efforts related to specific DBMSs
- Experience in writing complex SQL queries preferred
- Ability to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows
- Good temperament: relationship-building and stakeholder management skills, with the ability to interact with a diverse group of stakeholders and proactively provide actionable data
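The Spark tuning item above calls for hands-on performance work. Below is a minimal PySpark sketch of the kind of tuning involved; the app name, paths, partition count, and join key are hypothetical placeholders, and real values depend on cluster size and data volume.

```python
from pyspark.sql import SparkSession

# Minimal tuning sketch: all names and values below are hypothetical.
spark = (
    SparkSession.builder
    .appName("tuned-ingestion")                           # hypothetical app name
    .config("spark.sql.shuffle.partitions", "400")        # size shuffles to the data, not the default 200
    .config("spark.sql.adaptive.enabled", "true")         # AQE coalesces small/skewed partitions at runtime
    .config("spark.serializer",
            "org.apache.spark.serializer.KryoSerializer") # faster serialization than the Java default
    .getOrCreate()
)

df = spark.read.parquet("/data/raw/events")               # hypothetical input path

# Repartition on the join key before a wide join to spread skew evenly.
df = df.repartition(400, "customer_id")                   # hypothetical key
df.write.mode("overwrite").parquet("/data/curated/events")
```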
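For the Kappa/Lambda streaming item, a Kappa-style design keeps a single streaming path instead of separate batch and speed layers. The sketch below shows that shape with Spark Structured Streaming reading from Kafka; the broker address, topic, and sink paths are assumptions, and the job needs the spark-sql-kafka connector package on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kappa-stream").getOrCreate()

# Single streaming path (Kappa): Kafka in, parquet out; no separate batch layer.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    .select(col("value").cast("string").alias("payload"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/stream/events")             # hypothetical sink path
    .option("checkpointLocation", "/chk/events")       # checkpointing enables recovery on restart
    .start()
)
query.awaitTermination()
```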
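And for the orchestration item, a minimal Airflow DAG sketch (Airflow 2.x assumed); the DAG id, schedule, and task bodies are hypothetical placeholders for real ingestion and Spark-submit steps.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def ingest():
    print("pull from source")      # placeholder for a real extraction step

def transform():
    print("run spark job")         # in practice this might be a SparkSubmitOperator

with DAG(
    dag_id="daily_ingest",         # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",             # 'schedule' needs Airflow >= 2.4; older versions use schedule_interval
    catchup=False,
):
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    ingest_task >> transform_task  # transform runs only after ingest succeeds
```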