Mosaic

Enhancing Mosaic’s resource management tool with an AI recommendation engine

AI
SaaS

Our Impact

We helped Mosaic deploy an AI recommendation engine for their resource management product and stood up the supporting infrastructure on GCP.
  1. Helped Mosaic improve their SaaS product with cutting-edge tech on cloud-native infrastructure
  2. Built the infrastructure on GCP to match and connect to their AWS setup, while meeting their multi-tenancy security requirements
  3. Helped seamlessly integrate large language models (LLMs) into the recommendation engine for their resource management software

Like what you see? Let’s chat about your next project.

Mosaic is a resource management platform that companies use to manage their people across projects.

Users can distribute work based on employee skills and availability, as well as client or project needs. It provides visibility into where people are deployed and when they are underutilized, and can be harnessed for forecasting and capacity planning.

THE VISION

Mosaic wanted to set up their AI infrastructure on GCP while following conventions and patterns similar to their existing AWS setup. They also needed to meet their goals around multi-tenancy security compliance. The AI would empower their platform to coach users with astute recommendations on project assignments.

WHY LAZER?

Mosaic needed our expertise in both Google Cloud and AI best practices. They chose us because of our high-calibre engineering talent, our strong product capabilities, and how fast we could iterate on both projects.

Approach

Bootstrapping the AI infrastructure

Standing up the infra on GCP: We built the GCP environment to their standards and in compliance with SOC 2 (System and Organization Controls 2). We securely connected their AWS data to GCP through a VPN and set up Google Kubernetes Engine (GKE) to receive their compute workloads across multiple environments.
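
For illustration, the snippet below sketches how per-environment GKE clusters could be provisioned with Google's Python client. The project ID, region, cluster names, and node count are placeholders, not Mosaic's actual setup, which was built to their own standards and tooling.

```python
# Hypothetical sketch: provisioning one GKE cluster per environment.
# Project ID, region, cluster names, and node counts are illustrative placeholders.
from google.cloud import container_v1

client = container_v1.ClusterManagerClient()
parent = "projects/mosaic-ai-demo/locations/us-east1"  # hypothetical project and region

for env in ["staging", "production"]:
    cluster = container_v1.Cluster(
        name=f"mosaic-ai-{env}",
        initial_node_count=3,  # small default node pool; sized per workload in practice
    )
    operation = client.create_cluster(parent=parent, cluster=cluster)
    print(f"Creating cluster mosaic-ai-{env}: operation {operation.name}")
```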

Meeting strict security requirements: We created a combined single- and multi-tenancy (hybrid-tenancy) infrastructure, within SOC 2 compliance, to host clients with strict security requirements, such as government clients. This included end-to-end encryption across every moving part, backups, and a highly available system for resilience.
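
As a simplified illustration of the hybrid-tenancy idea (not Mosaic's actual implementation), the sketch below routes high-security tenants to dedicated isolation boundaries while pooling standard tenants on shared infrastructure; all names and the namespace convention are hypothetical.

```python
# Hypothetical sketch of hybrid tenancy: dedicated isolation for high-security
# tenants (e.g. government clients), shared infrastructure for everyone else.
from dataclasses import dataclass

@dataclass
class Tenant:
    tenant_id: str
    requires_dedicated: bool  # e.g. government or other strict-compliance clients

def resolve_namespace(tenant: Tenant) -> str:
    """Return the isolation boundary (e.g. a Kubernetes namespace) for a tenant."""
    if tenant.requires_dedicated:
        # Single-tenant path: each high-security client gets its own namespace,
        # encryption keys, and backup schedule.
        return f"tenant-{tenant.tenant_id}"
    # Multi-tenant path: standard clients share a pooled namespace,
    # with per-tenant isolation handled at the data layer.
    return "shared-pool"

print(resolve_namespace(Tenant("gov-001", requires_dedicated=True)))    # tenant-gov-001
print(resolve_namespace(Tenant("acme-042", requires_dedicated=False)))  # shared-pool
```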

Approach

Enhancing recommendations for the user

We advised Mosaic on how to best harness AI and how to choose the LLMs they wanted to integrate with their engine.

In their tool, users would automatically see augmented recommendations based on project and employee data: not only who to put on a project, but also which projects to give available employees. The recommendations also needed to extract employee skills in order to match people to teams and projects. We worked closely with their team on deployment and produced proofs of concept to guide their engineers toward the ideal user experience.
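
To give a sense of the matching problem the engine solves, here is a simplified, hypothetical sketch that scores available employees against a project's required skills. In the real product those skills come from Mosaic's own data and the LLM-powered extraction described above; the names and skill sets below are made up for illustration.

```python
# Hypothetical sketch: score available employees against a project's required
# skills. Skills are hard-coded here; in practice they are extracted from
# employee and project data.

def match_score(employee_skills: set[str], required_skills: set[str]) -> float:
    """Fraction of required skills the employee covers (0.0 to 1.0)."""
    if not required_skills:
        return 0.0
    return len(employee_skills & required_skills) / len(required_skills)

employees = {
    "Avery": {"react", "typescript", "gcp"},
    "Jordan": {"python", "gcp", "kubernetes"},
}
project_requirements = {"gcp", "kubernetes"}

ranked = sorted(
    employees.items(),
    key=lambda item: match_score(item[1], project_requirements),
    reverse=True,
)
for name, skills in ranked:
    print(name, round(match_score(skills, project_requirements), 2))
# Jordan 1.0, Avery 0.5 -- Jordan would be recommended first for this project.
```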

Approach

Tuning the LLMs to Mosaic's needs

We compared various providers and models, including OpenAI, Google Gemini, and Meta Llama, in terms of speed, cost, and quality of results.

We also considered OpenAI’s Assistants API, which was in beta at the time. Mosaic chose to move forward with the OpenAI models, and we worked with several over the course of the project, including GPT-3.5, GPT-4, and GPT-4o.
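
As a rough illustration of how response speed can be compared across those models (cost and output quality were weighed separately and are not captured here), a benchmarking loop like the hypothetical one below measures latency per model; the prompt is a placeholder.

```python
# Hypothetical sketch: compare response latency across OpenAI chat models.
# Requires OPENAI_API_KEY in the environment; the model list and prompt are illustrative.
import time
from openai import OpenAI

client = OpenAI()
prompt = "Suggest the best-suited employee for a two-week GCP migration project."

for model in ["gpt-3.5-turbo", "gpt-4", "gpt-4o"]:
    start = time.perf_counter()
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    elapsed = time.perf_counter() - start
    print(f"{model}: {elapsed:.2f}s, {response.usage.total_tokens} tokens")
```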

We helped fine-tune the prompts that went into the LLMs and then used the resulting data sets to create the desired frontend output. Their team then stood the process up in their own environment to reproduce the results. Throughout, we supported them with effective solutions, which they implemented to keep the output consistent and predictable.
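
As a hedged sketch of that prompt-to-frontend flow, a tuned system prompt can ask the model for structured JSON that the UI renders directly. The prompt wording and response schema below are assumptions for illustration, not Mosaic's actual prompts.

```python
# Hypothetical sketch: a tuned prompt that asks the model for structured JSON,
# which the frontend can render directly as recommendations.
import json
from openai import OpenAI

client = OpenAI()

system_prompt = (
    "You are a resource-planning assistant. Given employee availability and "
    "project needs, respond ONLY with JSON of the form "
    '{"recommendations": [{"employee": str, "project": str, "reason": str}]}.'
)
user_prompt = (
    "Avery is free next sprint and knows React; "
    "Project Atlas needs a frontend developer."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ],
    response_format={"type": "json_object"},  # keeps the output machine-parseable
)

recommendations = json.loads(response.choices[0].message.content)
print(recommendations["recommendations"])
```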

Mosaic was happy with the results and the pace: we wrapped up the infrastructure work quicker than expected, which left more project time to help with the LLM work.

Mosaic has been piloting their AI features in beta and plans to roll them out to their full client base soon.

Ready to make an impact?


Let's Talk

founders@lazertechnologies.com
