How Gorilla deployed Google Gemini in energy retail: Metomic's data governance framework protects sensitive customer data while enabling AI productivity.

Gorilla, a data analytics platform serving the energy retail industry, successfully deployed Google Gemini across their organization while maintaining strict data governance controls required by their highly regulated sector. By partnering with Metomic, they created a framework for secure AI enablement that protects sensitive customer data while unlocking productivity gains for employees.
The energy retail sector operates under stringent data protection requirements. As Christopher Callaghan, Security and Compliance lead at Gorilla, explains: "Energy retailers are quite sensitive when it comes to data. Anything that goes on within the industry hits the news, hence security and compliance ends up being pretty critical to them."
When Gorilla evaluated Google Gemini for their Google Workspace environment, they faced a critical challenge that many enterprises encounter: Gemini's strength as a knowledge retrieval tool became a potential security vulnerability.
The core issue? Gemini respects existing Google Drive permissions—but years of collaborative work had created permission sprawl that made deployment risky.
"The benefit of Drive is that it is amazing in collaboration," Chris notes. "But it is clear that it slowly erodes the basics of role-based access. People are pulled into documents, but documents are constantly evolving. Now with Gemini, they just have to ask the right question and it appears."
The Intern Problem: An intern could now query "What are the CEO's private diary entries?" and potentially surface sensitive information they would never have found through traditional file browsing.
Document Evolution: A document shared three years ago for planning purposes might now contain salary information or hiring roadmaps, but original collaborators retain access without knowing the content has changed.
Customer Data Exposure: With Gorilla serving energy retailers handling sensitive pricing and forecasting data, any data leakage could damage client relationships and regulatory standing.
Gorilla built a pyramid governance model:
AI Management Group: Technical leaders, security/compliance professionals, and operations team members who set strategic direction and policy.
Gen AI Interest Group: Early adopters who complete AI training and demonstrate competency. This group tests new tools, identifies use cases, and helps pull along skeptical employees.
"We need these early adopters to drive forward and say, here's all the interesting things we can do, and pull along those people that aren't necessarily so interested," Chris explains.
Gorilla took a phased approach to AI adoption:
Green Zone (Approved): Internal productivity tools for operations, HR processes, and administrative tasks where data sensitivity is lower.
Yellow Zone (Controlled): Google Gemini deployment with extensive access controls and continuous monitoring.
Red Zone (Restricted): Customer data and product features remain off-limits until more mature AI governance controls are in place.
"We can't necessarily add OpenAI as a sub-processor—that's just going to send alarm bells to our customers," Chris notes. "We need to be able to put guardrails around those little pockets in the business."
Gorilla partnered with Metomic to address Gemini's native security limitations:
Automated Content Classification: Metomic scans Google Drive to identify and label sensitive documents based on content patterns (customer data, financial information, personal data, etc.).
Permission Remediation: Files containing sensitive data have over-broad sharing tightened, for example by removing organisation-wide link access or restricting the file to specific users and groups.
Continuous Monitoring: Real-time alerts via Slack when sensitive documents are shared inappropriately or permissions change on classified files.
User Education: Labeling appears directly on files, making employees aware of data sensitivity and their handling responsibilities.
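The snippet below is a minimal sketch of the classify-then-alert pattern described above, not Metomic's product API: it matches text against simple content patterns and posts a notification to a Slack incoming webhook. The patterns and the webhook URL are placeholders.

```python
import json
import re
import urllib.request

# Hypothetical content patterns standing in for a real classification engine.
PATTERNS = {
    "personal_data": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),           # email addresses
    "financial_data": re.compile(r"\b(salary|forecast|pricing)\b", re.I),  # finance keywords
}

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"  # placeholder


def classify(text: str) -> list[str]:
    """Return the sensitivity labels whose patterns match the text."""
    return [label for label, pattern in PATTERNS.items() if pattern.search(text)]


def alert(file_name: str, labels: list[str]) -> None:
    """Post a simple alert to a Slack incoming webhook for a sensitive file."""
    payload = {"text": f"Sensitive content detected in '{file_name}': {', '.join(labels)}"}
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    labels = classify("Q3 pricing forecast shared with jane.doe@example.com")
    print(labels)
    # With a real webhook URL configured: alert("Q3 planning doc", labels)
```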
Recognizing that perfect prevention is impossible in collaborative environments, Gorilla implemented quarterly "red team" exercises:
"We're coming up with some kind of adversarial querying that we're just going to have to run quarterly to basically make sure that certain key people—if there was a document that had changed context that they now have access to sensitive information—we can flag it early."
This proactive testing helps identify documents whose content has drifted into sensitive territory while earlier collaborators still retain access, so exposure can be flagged early.
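Part of such an exercise can be scripted. The harness below is a hypothetical sketch, not Gorilla's actual tooling: query_as_user is a stub for however you run a prompt under a specific employee's permissions, and the probe queries and sensitive markers are illustrative.

```python
# Hypothetical red-team harness for quarterly adversarial querying.

ADVERSARIAL_PROBES = [
    "Summarise any documents that mention salaries or compensation.",
    "What hiring plans are discussed in documents I can access?",
    "List any customer pricing or forecasting figures I can see.",
]

SENSITIVE_MARKERS = ["salary", "compensation", "hiring roadmap", "pricing"]


def query_as_user(user: str, prompt: str) -> str:
    """Stub: return the assistant's answer for this prompt under this user's permissions.

    Replace with a real call that runs the prompt through the AI assistant
    while logged in as (or impersonating) the target test account.
    """
    return ""


def run_red_team(users: list[str]) -> list[tuple[str, str]]:
    """Return (user, probe) pairs whose answers appear to expose sensitive content."""
    findings = []
    for user in users:
        for probe in ADVERSARIAL_PROBES:
            answer = query_as_user(user, probe)
            if any(marker in answer.lower() for marker in SENSITIVE_MARKERS):
                findings.append((user, probe))
    return findings


if __name__ == "__main__":
    print(run_red_team(["intern@example.com", "analyst@example.com"]))
```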
Risk Reduction: Gemini deployed to initial user group with confidence that sensitive customer data remains protected.
Visibility Improvement: Complete mapping of sensitive data locations across Google Workspace, something that didn't exist before the AI deployment project.
Policy Evolution: Shift from specific policies to principle-based frameworks that adapt as AI tools evolve.
Knowledge Access: Employees can now find information instantly rather than interrupting colleagues or searching through folder hierarchies.
Onboarding Acceleration: New team members access institutional knowledge without requiring extensive documentation or training time.
Creative Focus: "It makes work a purely creative pursuit," Chris envisions. "If you've got the information, it's there, now you do with it what you need to do with that."
Competitive Positioning: Gorilla can now confidently discuss AI capabilities with security-conscious energy clients, differentiating from competitors.
Scalable Framework: The governance structure established for Gemini applies to future AI tool adoption, whether Notion AI, Asana's AI features, or custom solutions.
Cultural Shift: AI interest group creates internal champions who drive adoption while remaining aware of risks.
Google Gemini respects Drive permissions but lacks granular controls for content-based sensitivity classification, permission remediation, and continuous monitoring of how files are shared.
Solution: Layer third-party governance tools like Metomic on top of native capabilities.
Years of clicking "Get Link" to share files with everyone in your organization create a massive attack surface once AI indexing begins.
Solution: Audit and remediate existing permissions before deploying AI tools. Shift to direct sharing with specific users or groups.
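Here is a minimal sketch of that audit step using the Google Drive v3 API, assuming you already hold authorised credentials with a Drive scope; the query and remediation policy should be adapted to your own environment. It lists files visible domain-wide via a link and, outside dry-run mode, removes the broad "domain" or "anyone" permissions.

```python
from googleapiclient.discovery import build


def audit_domain_link_sharing(creds, dry_run=True):
    """List files shared domain-wide via link and optionally remove broad permissions.

    `creds` is an authorised Google credentials object with a Drive scope.
    Note: for files in shared drives, permissions are not returned inline and
    must be fetched separately via permissions().list().
    """
    drive = build("drive", "v3", credentials=creds)
    page_token = None
    while True:
        resp = drive.files().list(
            q="visibility='domainWithLink' or visibility='domainCanFind'",
            fields="nextPageToken, files(id, name, permissions)",
            pageToken=page_token,
        ).execute()
        for f in resp.get("files", []):
            for perm in f.get("permissions", []):
                # 'domain' and 'anyone' permissions are the broad, link-style grants.
                if perm.get("type") in ("domain", "anyone"):
                    print(f"{f['name']}: broad permission {perm['id']} ({perm['type']})")
                    if not dry_run:
                        drive.permissions().delete(
                            fileId=f["id"], permissionId=perm["id"]
                        ).execute()
        page_token = resp.get("nextPageToken")
        if not page_token:
            break
```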
A planning document from three years ago might now contain salary information, but original collaborators don't know the content changed.
Solution: Implement continuous monitoring and quarterly adversarial testing rather than one-time audits.
Even with perfect technical controls, employees need to understand which data is sensitive and what their handling responsibilities are.
Solution: Combine automated labeling with Slack notifications and training programs.
Not every department or use case requires AI access immediately.
Solution: Begin with low-risk teams, gather feedback, refine controls, then expand gradually.
Expanded Gemini Access: Rolling out to additional teams based on successful pilot program results.
Integration Management: Evaluating third-party integrations that feed data into Google Drive and ensuring they inherit appropriate permissions and controls.
RAG System Exploration: Investigating retrieval-augmented generation systems where different employee roles access different knowledge bases (e.g., an intern bot vs. a finance bot vs. an operations bot); a minimal sketch of role-scoped retrieval follows this list.
Agentic AI: Eventually deploying AI agents that can take actions on behalf of employees, with strict scoping to prevent unauthorized data access.
Customer-Facing AI: Once internal controls are proven, exploring AI features within Gorilla's product offerings for energy retail clients.
Automated Data Curation: Moving toward systems that automatically identify, redact, and provision the right data for AI tools based on use case and user role.
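As a rough illustration of the role-scoped retrieval idea flagged in the RAG item above (a sketch under assumed names, not Gorilla's design), each role maps to the knowledge bases it may search, and retrieval filters results before anything reaches the model.

```python
# Hypothetical role-scoped retrieval filter for a RAG pipeline.
# Collection names and the index interface are illustrative.

ROLE_COLLECTIONS = {
    "intern": {"public_docs", "onboarding"},
    "finance": {"public_docs", "finance_reports"},
    "operations": {"public_docs", "runbooks", "ops_metrics"},
}


def allowed_collections(role: str) -> set[str]:
    """Return the knowledge bases a role is permitted to query."""
    return ROLE_COLLECTIONS.get(role, {"public_docs"})


def retrieve(query: str, role: str, index) -> list[dict]:
    """Search only the collections this role may see.

    `index.search` is a stand-in for whatever vector store the RAG system uses;
    each hit is assumed to carry a 'collection' field.
    """
    results = index.search(query)
    permitted = allowed_collections(role)
    return [doc for doc in results if doc.get("collection") in permitted]
```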
For teams building a similar framework, the playbook mirrors the steps described above: work out the questions you need to answer, create the two groups (an AI management group and a gen AI interest group), define clear boundaries (approved, controlled, and restricted zones), agree the immediate actions and the ongoing requirements, then run a pilot program before expanding access.
Gorilla's approach to Gemini deployment demonstrates that enterprise AI adoption doesn't require choosing between security and productivity. By implementing robust data governance before AI enablement, organizations can unlock the full potential of tools like Gemini while protecting sensitive information.
The key insight: AI security centres on ensuring AI accesses the right data, at the right time, for the right people.
As Chris concludes: "I would love to reach a point where everybody has exactly the information that they need to do their job. Nothing's living in someone's head. No one's having to constantly bug you on Slack. That would be amazing."
With the right governance framework, that vision is achievable—even in highly regulated industries like energy retail.