Key Responsibilities
- Data Processing: Manage daily operations of patent data and related data, including integration, parsing, storage, and backup, ensuring data security, stability, and availability.
- Data Requirement Analysis and Implementation: Analyze and understand new data requirements, design data processing solutions, write relevant documentation, and implement code.
- Data Quality Management: Monitor data quality, design and implement targeted optimization solutions, and continuously improve data integrity and accuracy.
- Data Processing Framework Optimization: Optimize data processing frameworks and their performance, improving the efficiency of system data handling to support rapid iteration of product data.
- Architecture Collaboration: Work closely with data architects, participate in architecture upgrades and updates, and enhance system scalability and stability.
Qualifications
- Educational Background: Bachelor’s degree or above in computer science, data engineering, software engineering, or related fields; at least 3 years of experience in data development.
- Technical Skills: Proficiency in Java or Python with a solid programming foundation; familiarity with common big data processing frameworks such as Flink and Spark, with performance optimization experience preferred; familiarity with Linux systems and routine operations.
- Ability Requirements: Strong sense of responsibility and proactive learning ability, able to adapt flexibly in a fast-paced environment; clear thinking, excellent communication skills, and team spirit, able to collaborate effectively with team members from diverse cultural backgrounds; basic understanding of data privacy and compliance, capable of working in a regulated environment.