
Senior Data Engineer

Posted 2 months ago

Details

Compensation
Not disclosed
Industry
Technology
Time commitment
Full time
Company size
Not disclosed

Skills

Python
Data Engineering
Airflow

Job description

A U.S.-based decision intelligence platform that helps enterprises reduce time-to-decision with its state-of-the-art solutions is looking for a Senior Data Engineer. The selected candidate will work on all aspects of data: platform and infrastructure buildout, services fronting the core platform, pipeline engineering, and tooling/services that augment the platform. The company is focused on building a SaaS platform for forecasting and optimizing supply chain and finance. So far, it has raised over $2.4 million in seed funding.
 
Job Responsibilities:
  • Architect and implement a robust data platform for company products
  • Build and deliver the next-gen data lifecycle management suite of tools/frameworks
  • Support serverless, real-time, API-based use cases, including ingestion into and consumption from the data lake, alongside batch (mini/micro-batch) processing
  • Design and develop highly efficient, reliable, and observable data pipelines using Airflow, dbt, PostgreSQL, ClickHouse, ElasticSearch, and other technologies
  • Develop and expose a metadata catalog that meets the exploration, profiling, and lineage requirements of the data lake
  • Convert user requirements and company ideas into real products
  • Build quick POCs to develop the data platform iteratively
  • Make data discoverable so that Data Scientists and Analysts can access and use it easily
  • Enable Data Science teams to test and productize ML models
  • Develop high-quality code and guide junior developers 
  • Ensure cost-effectiveness and timely availability of deliverables
  • Conduct code and design reviews to streamline operations and support peers
  • Coordinate CI/CD, test automation frameworks, and other related activities
  • Demonstrate technical capabilities, solutions, features, and considerations in business terms
  • Communicate issues, risks, and status precisely and effectively

Job Requirements:
  • Bachelor’s/Master’s degree in Engineering or Computer Science (or equivalent experience)
  • 5-8+ years of relevant experience as a Data Engineer
  • Proficiency with Python, Data Engineering, and Airflow
  • Experience developing software with Python, Pandas, SQLAlchemy, and Flask
  • Extensive experience with OOP (object-oriented design), coding, and testing patterns
  • Demonstrable experience engineering software platforms and working with a variety of data infrastructures
  • Practical experience building a complete data platform using a range of open-source technologies
  • Skilled in building metadata, lineage, observability, and discoverability for the data platform
  • Familiarity with DevOps best practices such as containerization, CI/CD, secrets management, blue-green deployments, and others
  • Expertise with the principles of data modeling, distributed computing, and building optimized SQL queries
  • Preference will be given to candidates who keep up with the latest developments in Machine Learning
  • Ability to collaborate efficiently with and support data scientists
  • Understanding of dbt, ClickHouse, PostgreSQL, Kubernetes, and GraphQL is a big plus
  • Familiarity with Google Cloud Platform (e.g. GCS, Cloud Composer, BigQuery) is a plus
  • Outstanding communication and interpersonal skills
  • Good command of verbal and written English
