About this role
Your mission?
Turn our rich operational and product telemetry into trustworthy, timely, secure, and business-ready data. Build the semantic layer and pipelines that power GTM, Product, Operations, and Partner Success—so decisions are faster, experiments are rigorous, and growth is compounding.
The role
What are we looking for?
We’re hiring a Principal Data Scientist / Developer—part architect, part builder, part strategist—to own our end-to-end data foundation and accelerate impact from insights to shipped improvements.
You’ll partner with Product and Engineering leads to design the next-gen data platform (batch and streaming), define metrics and contracts, and ship models (analytics, ML, and LLM-assisted) that lift activation, expansion, and reliability.
Reporting to the COO, you’ll work hands-on while coaching others, translating business questions into robust, production-grade data products. This role is based in Madrid, Spain, with a hybrid work model.
This isn’t about dashboards alone—it’s about building the engines that move the business.
What you’ll own and drive
End-to-end data infrastructure: From ingestion to orchestration to serving (batch/stream), optimizing cost, reliability, and speed.
Data quality, integrity, and availability: CI/CD for data, contracts, tests, and observability as first-class citizens (see the sketch after this list).
Pipelines & processes: Design/optimize ingestion, transformation, and analysis workflows for structured and unstructured data.
Insights to action: Partner with GTM, Product, Operations, and Partner Success to prioritize questions, run experiments, and land changes.
Models & semantics: Own core metrics, semantic layers, reporting models, and feature stores aligned to the partner funnel (MQL→SQL→ARR).
Governance & collaboration: Lineage, catalog, permissions, and clear documentation so teams can self-serve safely and confidently.
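To give a flavor of what "contracts, tests, and observability as first-class citizens" looks like in practice, here is a minimal sketch of a contract-style data test in Python. The `orders` table, its columns, and the rules are hypothetical, purely for illustration:

```python
import pandas as pd

# Hypothetical contract for an `orders` table: required columns,
# expected dtypes, and basic quality rules.
CONTRACT = {
    "required_columns": {"order_id": "int64", "partner_id": "int64", "arr_eur": "float64"},
    "non_nullable": ["order_id", "partner_id"],
}

def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of contract violations; an empty list means the data passes."""
    errors = []
    for col, dtype in CONTRACT["required_columns"].items():
        if col not in df.columns:
            errors.append(f"missing column: {col}")
        elif str(df[col].dtype) != dtype:
            errors.append(f"{col}: expected {dtype}, got {df[col].dtype}")
    for col in CONTRACT["non_nullable"]:
        if col in df.columns and df[col].isna().any():
            errors.append(f"{col}: contains nulls")
    if "order_id" in df.columns and df["order_id"].duplicated().any():
        errors.append("order_id: duplicate keys found")
    return errors

if __name__ == "__main__":
    sample = pd.DataFrame({"order_id": [1, 2], "partner_id": [10, 11], "arr_eur": [99.0, 250.0]})
    assert validate_orders(sample) == []
```

In a real pipeline, a check like this would run in CI/CD and block a deploy or page an owner rather than assert.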
Day-to-day tasks
Design/optimize pipelines (ingestion, transformation, orchestration); a minimal sketch follows this list.
Develop and maintain analytical and ML/LLM models (structured/unstructured; stats/experimentation).
Implement/improve dashboards (Dash/Plotly, Power BI, Tableau, Looker).
Analyze trends/patterns to improve processes, product, and the partner funnel.
Data modeling (dimensional, lakehouse, feature stores).
Data quality & observability (detect/fix inconsistencies).
Coordinate accessibility & usability (catalog, lineage, permissions).
Automate extraction/transformations, data testing, and deployment.
Document best practices, standards, and methodologies.
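To make the pipeline tasks concrete, here is a minimal ingestion-and-transformation sketch using DuckDB and Polars from the stack below. The file, columns, and activation rule are invented for illustration:

```python
from datetime import datetime

import duckdb
import polars as pl

def run_pipeline(path: str) -> pl.DataFrame:
    """Ingest raw events, aggregate in SQL, then enrich in Polars."""
    con = duckdb.connect()  # an in-memory database is enough for the sketch
    daily = con.execute(
        f"""
        SELECT date_trunc('day', event_ts) AS day,
               partner_id,
               count(*) AS events
        FROM read_parquet('{path}')
        GROUP BY 1, 2
        """
    ).pl()  # hand the aggregate to Polars as a DataFrame
    # Derive a simple, made-up activation flag per partner and day.
    return daily.with_columns((pl.col("events") >= 2).alias("activated"))

if __name__ == "__main__":
    # Write a tiny sample file so the sketch runs end to end.
    pl.DataFrame({
        "event_ts": [datetime(2024, 1, 1, 9), datetime(2024, 1, 1, 10)],
        "partner_id": [1, 1],
    }).write_parquet("events.parquet")
    print(run_pipeline("events.parquet"))
```

In production the same shape would live inside an orchestrator task with retries, monitoring, and alerting.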
Tooling you’ll use
Linux, bash, Git, containerization; SQL/NoSQL; Flask/FastAPI for APIs; data stack: Pandas, scikit-learn, Polars, DuckDB; DL stack: PyTorch, HuggingFace, TensorFlow; Dash/Plotly/Power BI; Slack; Microsoft O365; Jira; Confluence; cloud platforms & data services.
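As a taste of the visualization side of this stack, a bare-bones Dash/Plotly app; the metric and the sample data are invented:

```python
import pandas as pd
import plotly.express as px
from dash import Dash, dcc, html

# Invented sample data standing in for a real semantic-layer query.
df = pd.DataFrame({"day": ["2024-01-01", "2024-01-02", "2024-01-03"],
                   "mqls": [12, 18, 15]})

app = Dash(__name__)
app.layout = html.Div([
    html.H2("Partner funnel (illustrative)"),
    dcc.Graph(figure=px.line(df, x="day", y="mqls", title="Daily MQLs")),
])

if __name__ == "__main__":
    app.run(debug=True)
```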
What will make us fall in love with you as a candidate?
It will be a perfect match if…
You are proactive. You spot data gaps early and propose scalable solutions without waiting for direction.
You prioritize clarity over clutter. You define crisp metrics and contracts, and make complex systems understandable.
You rock at strategic storytelling. You turn analysis into compelling narratives that move execs and engineers to act.
You are brave & creative. You experiment fast, embrace feedback, and solve problems others deem “too technical” or “too messy.”
You simplify the complex. You design data models and APIs that make cloud workflows intuitive for partners and internal teams.
You take ownership. You can lead the full data lifecycle hands-on and end-to-end: from ingestion to models to serving and observability.
You love being the bridge between tech and business. You translate questions into analysis, and insights into shipped changes.
For you, user and business impact go first. You champion data integrity and responsible AI to improve real outcomes.
You build trust. You give and receive feedback thoughtfully and constructively, and you lead with transparency.
Requirements that are important for us
Please apply only if you meet these requirements.
Principal-level experience: 8+ years across Data Science/Analytics/Engineering (including leading initiatives), with shipped data products and measurable business impact.
Mastery of data tooling: Python (preferred) and/or R; SQL & NoSQL; visualization (Power BI, Tableau, Looker, Dash/Plotly); ML/LLM/stats; cloud data services. You use AI responsibly to speed up quality work.
English skills: intermediate level required (B1 minimum; B2+ is a plus) to collaborate with international teams.
Data
SQL & NoSQL excellence: Schema design, performance tuning, partitioning, indexing.
ETL/ELT & orchestration: Reliable pipelines (batch/stream) with monitoring, alerting, and recovery.
Modeling: Dimensional modeling, lakehouse patterns, and feature stores that serve analytics and ML.
ML/LLM & experimentation: From baseline models to rigorous A/B tests and causal inference on structured/unstructured data (see the experiment sketch after this list).
Visualization: Power BI, Tableau, Looker, and Dash/Plotly for decision-ready views.
Observability & governance: Data tests, lineage, contracts, catalogs, and CI/CD for data.
Quality by design: Freshness, completeness, accuracy, and SLAs as non-negotiables.
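To ground the experimentation bullet above, here is a minimal two-proportion z-test of the kind a rigorous A/B test relies on; the conversion counts are hypothetical:

```python
from math import sqrt

from scipy.stats import norm

def two_proportion_ztest(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rates (pooled z-test)."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (conv_b / n_b - conv_a / n_a) / se
    return 2 * (1 - norm.cdf(abs(z)))

# Hypothetical experiment: 1,000 partners per arm, 11.0% vs 14.3% conversion.
p = two_proportion_ztest(conv_a=110, n_a=1000, conv_b=143, n_b=1000)
print(f"p-value: {p:.4f}")  # ~0.026 here, so the lift is unlikely to be noise
```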
Engineering
System design & architecture: Comfortable with monoliths, microservices, and event-driven patterns.
APIs & services: Build/consume services with Flask/FastAPI; productionize models and data apps (see the sketch after this list).
Cloud & data services: Practical fluency with cloud storage/compute and managed data offerings.
Core stack: Python, Pandas, Polars, DuckDB, scikit-learn; DL with PyTorch/TensorFlow/HuggingFace.
DevX: Linux, bash, Git, containerization; automation, testing, and CI/CD for code and data.
Security-minded: Design with least privilege, encryption, and secure defaults.
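And to illustrate productionizing a model behind a service, a minimal FastAPI sketch; the route, payload, and scoring rule are hypothetical placeholders for a real model:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="scoring-api")  # illustrative service name

class ScoreRequest(BaseModel):
    partner_id: int
    events_last_7d: int

@app.post("/score")
def score(req: ScoreRequest) -> dict:
    # Placeholder rule standing in for a trained model behind the endpoint.
    return {"partner_id": req.partner_id, "likely_to_activate": req.events_last_7d >= 5}
```

Saved as app.py, this runs with `uvicorn app:app`.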
Security & Compliance
The Data Analytics team ensures secure and responsible data use in line with ISO 27001, ENS, and HDS—protecting confidentiality, integrity, and availability.
We anonymize and encrypt where applicable, enforce access controls, uphold data quality, and ensure that analysis tools/platforms are compliant. You’ll work closely with our Security team to bake this into architecture and day-to-day operations.


