
Job Details

Data & AI Ops Engineer

Location: Noida

Salary: Not Disclosed

Requirements:

  • Strong programming skills in Python, PySpark, SQL.
  • Expertise in ML frameworks: scikit-learn, TensorFlow, PyTorch.
  • Experience with model serialization formats (pickle, ONNX, TorchScript); a short sketch follows this list.
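
As a rough illustration of those serialization formats, the sketch below pickles a scikit-learn classifier and exports a small PyTorch module to TorchScript and ONNX. The models, shapes, and file names are placeholders, not taken from the posting.

```python
# Illustrative only: minimal examples of the pickle, TorchScript, and ONNX
# formats named above. Models, shapes, and file names are placeholders.
import pickle

import torch
import torch.nn as nn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# pickle: native Python serialization, common for scikit-learn estimators.
X, y = make_classification(n_samples=100, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)
with open("model.pkl", "wb") as f:
    pickle.dump(clf, f)

# TorchScript: a traced PyTorch module that can run outside Python.
net = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
traced = torch.jit.trace(net, torch.randn(1, 4))
traced.save("model.pt")

# ONNX: framework-neutral export of the same module for generic runtimes.
torch.onnx.export(net, torch.randn(1, 4), "model.onnx")
```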

Job Description:

  • Deploy models into production using Docker, Azure ML, AWS SageMaker, and Vertex AI with scalable serving frameworks (a minimal serving sketch follows this list).
  • Develop CI/CD pipelines for ML workflows using GitHub Actions, MLflow CI/CD integrations, and container registries (see the MLflow sketch after this list).
  • Implement continuous training (CT), continuous integration (CI), and continuous delivery (CD) practices for ML systems.
  • Automate data ingestion, preprocessing, and feature pipelines with PySpark and SQL (see the PySpark sketch after this list).
  • Monitor model performance, drift, and data quality in production environments.
  • Implement logging, alerting, and observability for ML models and pipelines (see the Prometheus sketch after this list).
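
The posting does not name a particular serving framework, so the first sketch below wraps a pickled model in a small FastAPI app of the kind that could be packaged into a Docker image for Azure ML, SageMaker, or Vertex AI. FastAPI, the /predict route, and model.pkl are illustrative assumptions.

```python
# Hypothetical serving endpoint; FastAPI, /predict, and model.pkl are
# illustrative choices, not requirements from the posting.
import pickle

from fastapi import FastAPI
from pydantic import BaseModel

class Features(BaseModel):
    values: list[float]  # flat feature vector for a single prediction

app = FastAPI()

with open("model.pkl", "rb") as f:  # placeholder artifact path
    model = pickle.load(f)

@app.post("/predict")
def predict(features: Features):
    # Score one row; a production service would also batch and validate inputs.
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}

# Run with: uvicorn app:app --host 0.0.0.0 --port 8080
```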
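
For the CI/CD bullet, one common building block is a training step that logs parameters, metrics, and the model artifact to MLflow; a GitHub Actions job could run a script like the hedged sketch below on each commit. The experiment name, dataset, and hyperparameters are placeholders.

```python
# Hypothetical training-and-logging step for a CI pipeline; the experiment
# name, dataset, and hyperparameters are placeholders.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("ml-ops-demo")
with mlflow.start_run():
    clf = LogisticRegression(max_iter=200).fit(X_train, y_train)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("test_accuracy", clf.score(X_test, y_test))
    # The logged model artifact is what a CD job would later deploy.
    mlflow.sklearn.log_model(clf, "model")
```

A workflow file would typically install dependencies, run this script against the tracking server, and then promote the logged model or a container image built from it.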
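
For the data-ingestion and feature-pipeline bullet, a representative PySpark step is sketched below; the storage paths, table layout, and column names are invented for illustration.

```python
# Hypothetical feature pipeline; paths, columns, and aggregations are
# placeholders, not part of the posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("feature-pipeline").getOrCreate()

# Ingest raw events (path is a placeholder).
events = spark.read.parquet("s3://bucket/raw/events/")

# Preprocess and aggregate into per-user features.
features = (
    events
    .filter(F.col("event_type").isNotNull())
    .groupBy("user_id")
    .agg(
        F.count("*").alias("event_count"),
        F.avg("session_length").alias("avg_session_length"),
    )
)

# Write the feature table for downstream training and serving.
features.write.mode("overwrite").parquet("s3://bucket/features/user_daily/")
```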
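
For the monitoring, logging, and observability bullets, the sketch below exposes basic prediction metrics with the Prometheus Python client; the metric names and the scoring stub are placeholders, and a tool such as Grafana would normally visualise the scraped metrics.

```python
# Hypothetical observability wrapper; metric names and the scoring stub are
# placeholders, not part of the posting.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

PREDICTIONS = Counter("model_predictions_total", "Predictions served")
LATENCY = Histogram("model_prediction_latency_seconds", "Prediction latency")

def predict(features):
    return random.random()  # stand-in for a real model call

@LATENCY.time()
def serve_one(features):
    PREDICTIONS.inc()
    return predict(features)

if __name__ == "__main__":
    start_http_server(8000)  # exposes /metrics for Prometheus to scrape
    while True:
        serve_one({"x": 1.0})
        time.sleep(1)
```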

Skills Required:

  • Strong programming skills in Python, PySpark, SQL.
  • Familiarity with Kubernetes for scaling ML workloads.
  • Experience with feature stores and monitoring tools (Feast, WhyLabs, Evidently AI, Prometheus, Grafana); a drift-report sketch follows this list.
  • Knowledge of data governance and compliance (GDPR, HIPAA, etc.).
  • Exposure to large-scale distributed systems and real-time inference.
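
As one way to check the data drift mentioned above, the sketch below uses the Report API from Evidently's 0.2 to 0.4 releases (import paths changed in later versions); the reference and current datasets are synthetic placeholders.

```python
# Hypothetical data-drift check; synthetic data, and the import paths assume
# the Evidently 0.2-0.4 Report API.
import numpy as np
import pandas as pd
from evidently.metric_preset import DataDriftPreset
from evidently.report import Report

rng = np.random.default_rng(0)
reference = pd.DataFrame({"feature": rng.normal(0.0, 1.0, 1000)})
current = pd.DataFrame({"feature": rng.normal(0.5, 1.0, 1000)})  # shifted

report = Report(metrics=[DataDriftPreset()])
report.run(reference_data=reference, current_data=current)
report.save_html("drift_report.html")  # artifact for review or alerting
```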