Big Data Developer
$6,500–$9,000 / month
Job Description
Big Data Developer
Experience: 3–10 Years
Job Requirements
Core Skills & Expertise
- Strong understanding of concurrent software systems, with a focus on scalability, maintainability, and robustness.
- Expertise in designing application solutions within the Hadoop ecosystem.
- Deep knowledge of Hive, HDFS, YARN, Spark, Spark SQL, Scala, and PySpark.
- Proficiency with HDFS file formats such as Parquet, ORC, and SequenceFile, and their respective use cases.
- Solid understanding of data warehousing systems.
- Experience in scripting languages such as Shell and Python.
- Familiarity with the Hortonworks distribution and Hive execution engines (Tez, MapReduce).
- Exposure to Java, RESTful services, and Maven is a plus.
Automation & Deployment
- Experience with Control-M job development and resource monitoring using Grafana.
- Ability to create automation scripts in Jenkins for builds, test frameworks, and application configurations.
- Hands-on experience in scalable application deployment using tools like Bitbucket, Jenkins, and Azure DevOps (ADO).
Skill Set
Mandatory Skills:
- Big Data Technologies: Hive, HDFS, Spark, Scala, PySpark
Good to Have:
- Schedulers: Control-M
- ETL Tools: Dataiku
- Scripting & OS: Unix/Shell scripting
- Integration Services: FileIT, MQ, and others
- CI/CD Tools: Jenkins, Jira, Azure DevOps (ADO) suite