Job Description
We are looking for energetic, high-performing, and highly skilled GCP Data Engineers to help shape our technology and product roadmap.
Responsibilities:
- Develop and maintain large-scale data processing pipelines using PySpark on Dataproc, BigQuery, and SQL.
- Use BigQuery and Dataproc to migrate existing Hadoop/Spark/Hive workloads to Google Cloud.
- Use BigQuery to carry out batch and interactive data analysis.
- Function as a member of an agile team by contributing to software builds through consistent development practices (tools, common components, and documentation).
- Develop and test software, including ongoing refactoring of code, and drive continuous improvement in code structure and quality.
- Enable the deployment, support, and monitoring of software across test, integration, and production environments.
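To give candidates a flavor of the day-to-day work the first two responsibilities describe, here is a minimal sketch of a batch PySpark job of the kind typically submitted to Dataproc: it reads a BigQuery table through the spark-bigquery connector, applies a row-level cleanup, and writes the result back. All table names, the staging bucket, and the `normalize_country` transform are hypothetical placeholders, not an actual IntraEdge pipeline.

```python
# Sketch of a batch pipeline: BigQuery -> transform -> BigQuery.
# Table names, bucket name, and column names below are placeholders.

def normalize_country(code):
    """Row-level cleanup, kept as a pure function so it is unit-testable
    without a Spark cluster."""
    if code is None:
        return "UNKNOWN"
    return code.strip().upper()

def run_pipeline():
    # Imported inside the function so this module also loads in
    # environments where PySpark is not installed.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import udf
    from pyspark.sql.types import StringType

    spark = SparkSession.builder.appName("events-daily").getOrCreate()

    # Read the source table via the spark-bigquery connector,
    # which ships preinstalled on Dataproc clusters.
    events = (spark.read.format("bigquery")
              .option("table", "my-project.analytics.events")  # placeholder
              .load())

    # Wrap the pure function as a UDF and apply it column-wise.
    norm = udf(normalize_country, StringType())
    cleaned = events.withColumn("country", norm(events["country"]))

    # Write back to BigQuery, staging through a GCS bucket (placeholder).
    (cleaned.write.format("bigquery")
        .option("table", "my-project.analytics.events_clean")
        .option("temporaryGcsBucket", "my-staging-bucket")
        .mode("overwrite")
        .save())
```

Such a job would typically be launched with `gcloud dataproc jobs submit pyspark job.py --cluster=<cluster> --region=<region>`; cluster and region here are placeholders.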
Minimum Qualifications:
This ...
Ready to Apply?
Submit your application today and join our talented team at IntraEdge.
Job Details
- Location: Haryana, India
- Job Type: Full-time
- Category: Computer Occupations
- Posted Date: February 26, 2026
- Application Deadline: April 07, 2026