Your "Self-Service" BI Platform is a Self-Service Chaos Engine.
Looker represents a powerful and disciplined approach to business intelligence. Unlike traditional BI tools that focus on drag-and-drop chart building, Looker's power comes from its semantic modeling layer, LookML. This "data as code" philosophy promises to create a single, governable source of truth, where business metrics are defined once, version-controlled in Git, and reused by everyone. It promises to enable true self-service analytics, where business users can explore data with confidence, knowing that the underlying definitions are consistent and trustworthy.
But this promise is only fulfilled when your BI platform is built by engineers who have truly internalized this "code-first" discipline. When your Looker instance is managed by analysts who are more comfortable with spreadsheets than with Git, your LookML project does not become a clean semantic layer. It becomes a sprawling, unmaintainable mess of duplicated code, inconsistent definitions, and performance bottlenecks. Your "single source of truth" becomes a "single source of confusion."
An analyst who can create a Look in the UI is not a Looker expert. An expert can design a modular and extensible LookML model. They can write complex derived tables and persistent derived tables (PDTs) to optimize performance. They can implement a granular access control strategy using user attributes. They treat their LookML project with the same rigor as any other software product, with code reviews, automated testing, and a clear deployment process. This playbook explains how Axiom Cortex finds the developers who possess this deep, architectural discipline.
Traditional Vetting and Vendor Limitations
A nearshore vendor sees "Looker" on a résumé and assumes competence. The interview might involve asking the candidate to build a simple visualization. This superficial approach completely fails to test for the critical skill that actually matters in Looker: the ability to write clean, scalable, and maintainable LookML.
The results of this flawed vetting process are predictable and painful:
- LookML Spaghetti: Your project consists of a few massive model files with thousands of lines of duplicated code. There is no clear structure, and a change to a single dimension definition requires a global search-and-replace across the entire project.
- Performance Disasters: Dashboards take minutes to load because every query runs against raw, unaggregated data. The team does not know how to use aggregate awareness or PDTs to pre-compute common queries and provide instant results.
- Metric Inconsistency: The definition of "revenue" is different in three different Explores. The sales team, the finance team, and the marketing team are all looking at different numbers, leading to a complete loss of trust in the BI platform.
- The "Git as a Save Button" Anti-Pattern: The team uses Git, but only as a way to save their work. There are no pull requests, no code reviews, and no automated testing. A developer can push a breaking change directly to production without any oversight.
The business impact is severe. You have invested in a powerful, governable BI platform, but you have ended up with a chaotic, untrustworthy, and slow system that is failing to deliver the promised value.
How Axiom Cortex Evaluates Looker Developers
Axiom Cortex is designed to find the engineers who think like software developers first and BI analysts second. We test for the practical skills and the "data as code" discipline that are essential for building and maintaining a professional BI platform with Looker. We evaluate candidates across four critical dimensions.
Dimension 1: LookML Modeling and Architecture
This is the core competency of an elite Looker developer. This dimension tests a candidate's ability to design a LookML project that is modular, maintainable, and scalable.
We provide candidates with a business problem and a database schema and ask them to design the LookML model. We evaluate their ability to:
- Design a Clean Project Structure: A high-scoring candidate will immediately talk about organizing the project with a base layer of views and a separate layer of models. They will use `include` statements to build a modular and DRY project.
- Master Joins and Relationships: Can they correctly define the relationships between views? Do they understand the difference between a `one_to_one`, `one_to_many`, and `many_to_many` relationship and the performance implications of each?
- Use `extends` for Reusability: Do they know how to use `extends` to share common sets of fields or joins across multiple Explores, avoiding code duplication? (A sketch combining these patterns follows this list.)
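To make this concrete, here is a minimal sketch of that layered structure. Every name in it (the `analytics_warehouse` connection, the `orders`, `customers`, and `base_person` views, the file paths) is an illustrative assumption rather than a prescribed layout, and the `orders` view is assumed to be defined elsewhere in the view layer.

```lookml
# views/customers.view.lkml -- hypothetical file; views live in their own layer
view: base_person {
  extension: required            # abstract: only consumable via extends
  dimension: id {
    primary_key: yes
    type: number
    sql: ${TABLE}.id ;;
  }
  dimension: email {
    type: string
    sql: ${TABLE}.email ;;
  }
}

view: customers {
  extends: [base_person]         # inherits id and email without duplicating them
  sql_table_name: analytics.customers ;;
}

# models/ecommerce.model.lkml -- hypothetical model file
connection: "analytics_warehouse"
include: "/views/*.view.lkml"    # pull in the view layer instead of defining views inline

explore: orders {
  join: customers {
    type: left_outer
    relationship: many_to_one    # explicit cardinality keeps fan-outs and symmetric aggregates correct
    sql_on: ${orders.customer_id} = ${customers.id} ;;
  }
}
```

The specific files matter less than the separation of concerns: views define fields once, the model composes them into Explores, and `extends` keeps shared fields in a single place.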
Dimension 2: Performance Optimization
A BI tool is only useful if it is fast. This dimension tests a candidate's ability to build a performant Looker instance.
We present a slow dashboard scenario and evaluate if they can:
- Implement Persistent Derived Tables (PDTs): Can they identify queries that are good candidates for pre-aggregation and build a PDT to materialize the results? Do they understand how to configure the trigger and persistence for a PDT?
- Use Aggregate Awareness: A high-scoring candidate will be able to design aggregate tables and write aggregate awareness rules in LookML to allow Looker to intelligently rewrite queries to use the smallest, fastest table possible. (A sketch pairing a PDT with an aggregate table follows this list.)
- Diagnose Slow Queries: Are they familiar with using the SQL Runner and the query analysis features in Looker to diagnose and optimize slow-running queries?
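As a rough illustration of how these two techniques look in LookML, the sketch below pairs an explicit PDT with an aggregate table. The datagroup, table, and field names (`nightly_etl`, `etl_job_runs`, `created_date`, `total_revenue`) are assumptions for the example, and the `orders` Explore and its fields are assumed to already exist in the model.

```lookml
# Hypothetical datagroup: rebuild persisted tables when the nightly ETL finishes.
datagroup: nightly_etl {
  sql_trigger: SELECT MAX(finished_at) FROM etl_job_runs ;;
  max_cache_age: "24 hours"
}

# A PDT that pre-aggregates order facts by day instead of scanning raw rows on every query.
view: daily_order_facts {
  derived_table: {
    sql:
      SELECT
        DATE(created_at) AS order_date,
        COUNT(*)         AS order_count,
        SUM(amount)      AS revenue
      FROM analytics.orders
      GROUP BY 1 ;;
    datagroup_trigger: nightly_etl   # persist the result and rebuild it with the ETL
  }

  dimension: order_date {
    type: date
    sql: ${TABLE}.order_date ;;
  }
  measure: revenue {
    type: sum
    sql: ${TABLE}.revenue ;;
  }
}

# Aggregate awareness: Looker transparently rewrites matching queries against
# this small rollup instead of the raw orders table.
explore: orders {
  aggregate_table: daily_revenue {
    query: {
      dimensions: [created_date]
      measures: [total_revenue]
    }
    materialization: {
      datagroup_trigger: nightly_etl
    }
  }
}
```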
Dimension 3: Data Governance and Security
Looker provides powerful tools for governing access to data. This dimension tests a candidate's ability to use these tools effectively.
We evaluate their knowledge of:
- Access Filters and User Attributes: Can they design a data access model using user attributes and `access_filter` parameters to implement row-level security, ensuring that users only see the data they are authorized to see? (A minimal example follows this list.)
- Content Access and Permissions: Can they design a folder and permissions structure that controls who can view, edit, and develop content?
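Here is a minimal sketch of row-level security with an `access_filter`, assuming a user attribute named `sales_region` has been defined in the admin panel and that the `orders` and `customers` views already exist:

```lookml
explore: orders {
  # Every query through this Explore is automatically filtered to the
  # region(s) assigned to the current user's sales_region attribute.
  access_filter: {
    field: customers.region
    user_attribute: sales_region
  }

  join: customers {
    type: left_outer
    relationship: many_to_one
    sql_on: ${orders.customer_id} = ${customers.id} ;;
  }
}
```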
Dimension 4: "Data as Code" Discipline
An elite Looker developer treats their LookML project as a software project. This dimension tests for that professional discipline.
Axiom Cortex assesses how a candidate:
- Uses Git for Collaboration: Do they have a disciplined workflow for using Git, including creating branches for new features, opening pull requests for code review, and resolving merge conflicts?
- Writes Data Tests: Do they know how to write data tests in LookML to validate the quality and integrity of their data?
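As a small example of the kind of test we expect candidates to write, here is a primary-key uniqueness check. The `orders` Explore, its `id` dimension, and its `count` measure are assumed to already exist in the model.

```lookml
# Fails the project's test run if any order id appears more than once.
test: order_id_is_unique {
  explore_source: orders {
    column: id {}                          # defaults to orders.id
    column: count { field: orders.count }
    sorts: [orders.count: desc]            # surface the worst offender first
    limit: 1
  }
  assert: no_duplicate_order_ids {
    expression: ${orders.count} = 1 ;;
  }
}
```

Run as part of a pull request check, a handful of tests like this catches broken joins and duplicate keys before business users ever see a wrong number.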
From a BI Tool to a Data Platform
When you staff your data team with Looker developers who have passed the Axiom Cortex assessment, you are investing in a team that can build a true data platform, not just a collection of dashboards.
A media company was struggling with a chaotic Looker environment. Every business user was building their own Looks and dashboards, leading to inconsistent metrics and a complete lack of trust. Using the Nearshore IT Co-Pilot, we assembled a "BI Platform" pod of two elite nearshore Looker developers.
In their first quarter, this team:
- Built a Governed LookML Model: They rebuilt the company's LookML project from the ground up, creating a single, modular, and well-documented model that served as the "single source of truth" for all business metrics.
- Implemented a Pull Request Workflow: They locked the project into production mode and implemented a mandatory pull request and code review process for all changes to the LookML.
- Empowered Self-Service Users: They trained the company's business users on how to use the new, governed Explores to safely build their own content with confidence.
The result was transformative. The company finally had a single, trusted source of truth for their data. The business users were empowered to answer their own questions, and the data team was freed from endless ad-hoc requests and could focus on building more strategic data products.
What This Changes for CTOs and CIOs
Using Axiom Cortex to hire for Looker competency is not about finding a tool expert. It is about insourcing the discipline of "data as code" and building a truly governable and scalable business intelligence platform. It ensures your investment in a powerful tool like Looker yields a real, strategic return.