As a Data Platform Engineer at Hailify, you will architect and implement cloud-native data pipelines and infrastructure to enable analytics and machine learning on large, rich datasets. You will be responsible for ensuring datasets are appropriately sourced, extracted, cleansed/transformed, validated/reconciled, and made available for analytics and machine learning purposes.
Responsibilities
- Architect a system: Design and propose a data reporting environment across all of Hailify’s data systems, potentially including data warehouses, data lakes, master data management systems, etc.
- Build a data pipeline: Scope, define, and document data warehousing and extract, transform, load (ETL/ELT) requirements, and carry them through implementation and testing. Provide production support to ensure stability and efficiency.
- Reconcile/QA data: Develop automated systems to monitor data integrity/quality and data lineage. Validate and reconcile large datasets.
- Find data platform solutions: Lead the research, design, and implementation of maintainable, scalable, reusable and performant software solutions that meet functional and non-functional requirements and that are aligned with our strategic direction.
- Strive for continuous improvement: Contribute to the continuous improvement of our overall data infrastructure. Identify ways data can be more accurate, trustworthy, and timely. Provide thought leadership around best practices and emerging concepts in the data analytics domain. Find opportunities for automating operational database activities to enable the organization to scale and reduce costs.
Requirements
- 5+ years of experience in data management, particularly in ETL/ELT, data integration, data engineering, data design and/or data modeling
- Experience in building, improving, and maintaining complex ETL and/or next-gen ELT pipelines
- Excellent SQL programming skills, including writing stored procedures, queries, views, user-defined functions, cursors, and common table expressions
- Keen understanding of data structures, data modeling, and software architecture
- Experience building cloud-native applications and with supporting technologies, patterns, and practices, including AWS/GCP/Azure, Docker, CI/CD, DevOps, and microservices
- Working knowledge of programming or scripting languages (e.g., Python)
- Degree in Computer Science, Engineering, Information Technology, or a similar field preferred
- Comfortable working during India (primary) / United States Eastern (secondary) business hours
About Hailify
Hailify’s mission is to make transportation efficient, equitable, and profitable through a programmatic delivery exchange that seamlessly connects merchants and drivers across a virtual network. We are a high growth, venture-backed company, with investments from firms like Triple Impact Capital, Boro Capital, and Amber Capital.
Our Culture
Being first to market with a product that solves a core problem for an entire industry motivates us to experiment, build and scale solutions thoughtfully through data-driven collaboration.
General Attributes
A great candidate for a position at Hailify would have the following attributes:
- Collaborative - Able to engage in thoughtful discussion without ego
- Independent - Able to work independently of management
- Accountable - Takes responsibility quickly and thoughtfully
- Data Driven - Uses data rigorously in the decision making process
Hailify is an equal opportunity employer. In accordance with applicable law, we prohibit discrimination against any applicant or employee based on any legally-recognized basis, including, but not limited to: race, color, religion, sex (including pregnancy, lactation, childbirth or related medical conditions), sexual orientation, gender identity, age (40 and over), national origin or ancestry, citizenship status, physical or mental disability, genetic information (including testing and characteristics), veteran status, uniformed service member status or any other status protected by federal, state or local law.