
Data Engineer (DIG01127)

Product Management & Content
Mumbai
Onsite
About Us
TimesPro strives to embody the values of Education 4.0: learner-centric, industry-relevant, role-specific, and technology-enabled, with the goal of making learning accessible to anyone who seeks to grow.
TimesPro aims to fulfil the aspirations of learners by making excellence accessible through learner-centric innovations and global collaborations. Established in 2013, we are the award-winning H.EdTech initiative of the Times of India Group, catering to the learning needs of Indians with aspirations of career growth.
We offer a variety of created and curated learning programmes across a range of categories, industries, and age groups. They include employment-oriented Early Career courses across BFSI, e-Commerce, and technology sectors; Executive Education for working professionals in collaboration with premier national and global educational institutions; and Enterprise Solutions for learning and development interventions at the organisational level.
Visit us at https://www.timespro.com
Job Description
Job Title: Data Engineer (Google Cloud Platform)
Job Summary:
We are seeking a skilled Data Engineer with expertise in Google Cloud Platform (GCP) to design, build, and maintain scalable data pipelines. This role involves working with BigQuery, Cloud Storage, Dataflow, and other GCP services to process and analyze large datasets. Additionally, the candidate will support machine learning model deployment and hosting, ensuring seamless integration with data pipelines.
The ideal candidate will have a strong understanding of ETL processes, data modeling, cloud infrastructure, and MLOps, ensuring efficient and reliable data movement, transformation, and model serving.
Key Responsibilities:
  • Design, develop, and optimize ETL pipelines to process structured and unstructured data.
  • Build and maintain real-time and batch data processing solutions using GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Functions (a minimal sketch follows this list).
  • Ensure data quality, security, and governance across all pipelines.
  • Monitor data workflows, troubleshoot failures, and improve performance.
  • Work with stakeholders to understand data requirements and implement scalable solutions.
  • Automate data processes using Python, SQL, and infrastructure-as-code tools like Terraform and Cloud Deployment Manager.
  • Optimize data warehouse solutions for performance and cost efficiency.
  • Deploy and manage machine learning models using services like Vertex AI, AI Platform, Cloud Run, and Cloud Functions.
  • Support model versioning, monitoring, and retraining pipelines in collaboration with Data Scientists and ML Engineers.
  • Implement MLOps best practices for CI/CD pipelines integrating ML model training, deployment, and serving.
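Example (illustrative): in its simplest form, the batch-pipeline work described above could look like the Python sketch below, which loads a file from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, bucket, and table names are placeholders, not details of this role.

    # Minimal batch load from Cloud Storage into BigQuery (placeholder names).
    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")  # assumed project ID

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,      # skip the CSV header row
        autodetect=True,          # infer the schema from the file
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # Load a CSV from a landing bucket into a staging table and wait for the job.
    load_job = client.load_table_from_uri(
        "gs://example-landing-bucket/orders/orders.csv",  # placeholder URI
        "example-project.staging.orders",                 # placeholder table
        job_config=job_config,
    )
    load_job.result()  # raises if the load fails

    table = client.get_table("example-project.staging.orders")
    print(f"Loaded {table.num_rows} rows into staging.orders")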
Preferred Qualifications:
  • Experience integrating BigQuery with third-party tools like Salesforce, Looker, or Power BI.
  • Exposure to model performance monitoring and automated retraining pipelines.
  • Knowledge of feature engineering, feature stores, and ML pipeline orchestration (see the sketch after this list).
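Example (illustrative): the automated-retraining and orchestration points above are the kind of workflow typically expressed as a Cloud Composer (Airflow) DAG. The sketch below is a minimal, assumed layout; the DAG name, schedule, and task logic are placeholders.

    # Minimal Airflow DAG chaining a drift check with a retraining step.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def check_drift(**context):
        # Placeholder: compare recent prediction stats against a baseline.
        print("checking feature/prediction drift")

    def retrain_model(**context):
        # Placeholder: submit a training job (e.g. a Vertex AI custom job).
        print("submitting retraining job")

    with DAG(
        dag_id="model_retraining",       # assumed DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@weekly",
        catchup=False,
    ) as dag:
        drift = PythonOperator(task_id="check_drift", python_callable=check_drift)
        retrain = PythonOperator(task_id="retrain_model", python_callable=retrain_model)
        drift >> retrain                 # retrain only runs after the drift check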
Education Requirements:
  • BE/BTech/Master's/MCA in Computer Science, Information Technology, or a related field.
Job Requirements
Required Skills & Qualifications:
  • 3+ years of experience in Data Engineering, preferably on Google Cloud Platform (GCP).
  • Proficiency in SQL and Python for data processing and automation.
  • Experience with BigQuery, Cloud Storage, Dataflow, Pub/Sub, and Cloud Composer (Airflow).
  • Strong understanding of data modeling, ETL design, and pipeline orchestration.
  • Hands-on experience with streaming and batch data processing.
  • Familiarity with IAM roles, security policies, and cost optimization strategies in GCP.
  • Experience with CI/CD for data pipelines (Git, Cloud Build, Terraform).
  • Knowledge of ML model deployment using Vertex AI, AI Platform, or Cloud Run (illustrated in the sketch below).
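Example (illustrative): on Vertex AI, the model-deployment skill above typically amounts to registering a trained artifact and deploying it to an endpoint, as in the Python sketch below using the google-cloud-aiplatform client. The project, region, artifact path, and serving image are assumptions, not specifics of this role.

    # Minimal Vertex AI model registration and deployment (placeholder names).
    from google.cloud import aiplatform

    aiplatform.init(project="example-project", location="asia-south1")

    # Upload a saved model artifact with a prebuilt serving container.
    model = aiplatform.Model.upload(
        display_name="example-classifier",
        artifact_uri="gs://example-models/classifier/v1",  # placeholder path
        serving_container_image_uri=(
            # assumed prebuilt scikit-learn serving image
            "asia-docker.pkg.dev/vertex-ai/prediction/sklearn-cpu.1-3:latest"
        ),
    )

    # Deploy to an endpoint for online predictions and print its resource name.
    endpoint = model.deploy(machine_type="n1-standard-2")
    print(endpoint.resource_name)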