Business opportunity
Demand for housing in one high-growth country significantly outstrips supply, especially in large cities. To help ensure that its growing population can find safe, affordable places to live, the country needed to accelerate residential construction. However, long and complex planning processes were causing delays at its ministry of urban affairs.
To answer vital policy and contract questions around new housing developments, legal and operational teams were spending significant time searching through large sets of documents in two languages. Much of the manual effort, which involved finding information and creating reports, was duplicated from project to project.
Technical challenge
The ministry wanted to create a “ChatGPT-like” generative AI (Gen AI) solution that provides natural-language answers to policy and contract questions.
The solution needed to work with complex ministry documents in two languages, providing accurate responses with traceable citations to source material. Support for multiple AI models was a must, as were data security and privacy.
With limited in-house machine-learning operations (MLOps) experience, the ministry aimed to increase its capacity by working with an external partner. The goal was to create a Gen AI foundation that could scale across multiple departments, from legal to marketing and engineering.
Our solution
Together, the ministry and Kyndryl built a Gen AI-powered chatbot on Amazon Bedrock to accelerate the housing process. It provides capabilities for contract analysis and insights; project documentation and knowledge search; secure, bilingual legal operations; and decision support and recommendations. Anthropic Claude large language models (LLMs) are used for legal reasoning, and the ministry added an AI21 model with stronger capabilities for one of the more difficult languages.
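The multi-model setup described above can be sketched with Amazon Bedrock's Converse API, which lets the same calling code target different models by swapping the model ID. This is an illustrative sketch, not the ministry's actual implementation: the routing rule, the `pick_model` helper, and the model IDs are assumptions for demonstration.

```python
# Hypothetical sketch: route each question to a Bedrock model based on its
# language. The routing rule and model IDs below are illustrative only.

CLAUDE_MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"  # legal reasoning
AI21_MODEL_ID = "ai21.jamba-1-5-large-v1:0"  # assumed ID for the second language

def pick_model(language: str) -> str:
    """Return the Bedrock model ID to use for a question's language."""
    # Assumption: English questions go to Claude; any other language
    # falls back to the AI21 model.
    return CLAUDE_MODEL_ID if language == "en" else AI21_MODEL_ID

def ask(bedrock_runtime, question: str, language: str) -> str:
    """Send one question through the Bedrock Converse API and return the text."""
    response = bedrock_runtime.converse(
        modelId=pick_model(language),
        messages=[{"role": "user", "content": [{"text": question}]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```

Because the Converse API normalizes request and response shapes across providers, `ask` works unchanged whichever model `pick_model` selects; only the model ID differs.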
Using the AWS Gen AI LLM Chatbot reference solution enabled rapid deployment of a production-ready base. Adding ministry documents and web sources in a Retrieval-Augmented Generation (RAG) pattern lets the chatbot draw information directly from trusted sources and provide citations, improving accuracy and reducing employee effort by producing reliable first drafts of reports.
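One way the RAG-with-citations flow could look uses Bedrock's `retrieve_and_generate` API against a Knowledge Base. This is a sketch under assumptions: the knowledge base ID, model ARN, and the `extract_sources` helper are hypothetical, not details from the ministry's deployment.

```python
# Hypothetical RAG sketch: answer a question from a Bedrock Knowledge Base
# and keep the source-document citations alongside the answer.

def answer_with_citations(question: str, kb_id: str, model_arn: str) -> dict:
    """Query the knowledge base and return the answer plus its sources."""
    # boto3 is imported lazily so the pure citation helper below can be
    # used and tested without AWS dependencies installed.
    import boto3

    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve_and_generate(
        input={"text": question},
        retrieveAndGenerateConfiguration={
            "type": "KNOWLEDGE_BASE",
            "knowledgeBaseConfiguration": {
                "knowledgeBaseId": kb_id,      # placeholder ID
                "modelArn": model_arn,         # placeholder model ARN
            },
        },
    )
    return {
        "answer": response["output"]["text"],
        "sources": extract_sources(response.get("citations", [])),
    }

def extract_sources(citations: list) -> list:
    """Pull unique S3 document URIs out of a retrieve_and_generate response."""
    uris = []
    for citation in citations:
        for ref in citation.get("retrievedReferences", []):
            uri = ref.get("location", {}).get("s3Location", {}).get("uri")
            if uri and uri not in uris:
                uris.append(uri)
    return uris
```

Surfacing the URIs returned in the `citations` field is what makes each answer traceable back to a specific ministry document rather than an unverifiable model claim.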
A serverless backbone of AWS Lambda, Amazon S3, and Amazon API Gateway simplifies operation and maintenance of the solution.
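The serverless request path can be illustrated with a minimal Lambda handler behind an API Gateway proxy integration. This is a sketch, not the ministry's code: the `answer_fn` parameter is a hypothetical stand-in for the Bedrock call so the handler's request handling can be shown (and tested) on its own.

```python
# Minimal sketch of the serverless request path: API Gateway invokes this
# Lambda handler with a JSON body; the handler validates input, calls the
# model, and returns an HTTP-style response.
import json

def handler(event, context, answer_fn=lambda q: "(no model configured)"):
    """Lambda entry point. answer_fn stands in for the Bedrock call."""
    try:
        body = json.loads(event.get("body") or "{}")
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid JSON"})}

    question = body.get("question")
    if not question:
        return {"statusCode": 400, "body": json.dumps({"error": "missing 'question'"})}

    return {
        "statusCode": 200,
        "body": json.dumps({"answer": answer_fn(question)}),
    }
```

Because Lambda, S3, and API Gateway are fully managed, this is essentially all the server-side code the request path needs; there are no instances to patch or scale, which is what keeps operation and maintenance simple.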