Data Engineer
Baselayer, an intelligent business identity platform trusted by over 2,200 financial institutions, is seeking a Data Engineer to build and scale its data infrastructure, focusing on data reliability, performance, and quality while collaborating closely with Product and Engineering teams.
Responsibilities
- Design, build, and maintain scalable data pipelines that ingest, clean, validate, and transform data from internal systems and external sources
- Own data reliability and quality through monitoring, alerting, lineage, and validation frameworks
- Build and maintain data models and curated datasets that support analytics, dashboards, customer reporting, and downstream ML use cases
- Partner with Engineering to define best practices for data architecture, storage, access controls, and performance
- Implement orchestration and scheduling for batch and near-real-time workflows as needed
- Optimize pipeline performance, cost, and scalability as data volumes grow
- Develop and maintain documentation and runbooks for pipelines, datasets, and operational procedures
- Identify data gaps and instrumentation needs, and work with engineering teams to improve event capture and logging
Skills
- 1 to 3 years of experience in data engineering, analytics engineering, or backend engineering with significant data pipeline ownership
- Strong Python skills and experience building production-grade data workflows
- Strong SQL skills with experience designing data models and transforming large datasets
- Experience building and maintaining ETL or ELT pipelines and working with data warehouses or analytics databases
- Familiarity with orchestration tools and workflow scheduling (for example Airflow, Dagster, Prefect, or similar)
- Strong understanding of data quality, testing, observability, and operational best practices
- Comfort working with large-scale datasets and troubleshooting performance issues
- Ability to communicate clearly with technical and non-technical stakeholders
- Experience working with identity, fraud, risk, compliance, or other regulated datasets
- Experience integrating with external data sources, APIs, and government or registry data
- Familiarity with streaming or near-real-time data patterns
- Highly feedback-oriented with a desire for continuous improvement
Benefits
- Equity package
- Unlimited vacation
- Comprehensive health coverage
- 401(k) with company match