We’re a newly formed data & analytics team of three. Together, we’ve built the foundation of our modern data platform using Airflow, dbt, Snowflake, and Looker. We're actively cleaning, aligning, and integrating multiple data sources, including marketing, finance, and product usage data, into a centralized and governed system.
We’re looking for a Data Engineer to help us transform fragmented data sources into robust, scalable pipelines and analytical assets that support self-serve reporting and advanced analysis across our product and go-to-market teams.
As our Data Engineer, you will:
- Build, monitor, and maintain scalable ELT pipelines using Airflow, dbt, and Snowflake
- Design and implement efficient data models to support analytics and BI use cases
- Collaborate with data analysts and stakeholders to ensure data reflects business logic accurately
- Maintain and improve integrations with systems like PostgreSQL, ClickHouse, Chargebee, HubSpot, and others
- Implement testing and data quality checks across pipelines and reporting layers
- Contribute to the evolution of our governance strategy, including asset ownership and documentation
Our Current Stack:
- Data Sources: PostgreSQL, ClickHouse, HubSpot, Chargebee, QuickBooks, Zendesk
- ETL/ELT: dbt, Python
- Data Warehouse: Snowflake
- Orchestration: Airflow
- BI & Governance: Looker, Redash, OpenMetadata
- Version Control: GitLab
Requirements
To be successful in this role you should have:
- 4+ years of experience in Data Engineering roles, ideally within B2B SaaS or fast-paced tech environments
- Strong proficiency in SQL and Python
- Practical experience working with orchestration tools (Airflow) and ELT tools (dbt)
- Solid understanding of data warehousing principles and data modeling best practices in Snowflake
- Comfort handling complex, high-granularity datasets and optimizing for performance
- Experience integrating SaaS tools like HubSpot or Chargebee
- Experience implementing monitoring, testing, and alerting for data workflows
- A clear, structured thinker who values high-quality, maintainable code
Benefits
What's in it for you?
- A focus on professional development
- Interesting and challenging projects
- Fully remote work with flexible hours, so you can schedule your day and work from anywhere in the world
- 24 paid vacation days per year, 10 national holidays, and unlimited sick leave
- Compensation for private medical insurance
- Co-working and gym/sports reimbursement
- Budget for education
- The opportunity to earn a reward for the most innovative idea the company can patent
By applying for this position, you agree to the CloudLinux Privacy Policy (https://cloudlinux.com/privacy-policy) and consent to the storage and processing of your personal data as described there.