Lawrance Reddy · Microsoft MVP, AI Services · Durban
Foundry, in the field.
I work on Microsoft Azure AI Foundry. ConservAxion verifies clean-energy and biodiversity outcomes for KwaZulu-Natal households. Pfula is a bilingual isiZulu and English assistant for South African government services. This is where I write up the engineering.
Flagship pieces
AI Services (Microsoft Foundry)
ConservAxion — verifying clean-energy and biodiversity impact with Azure AI Foundry
An Azure AI Foundry pilot for impact verification: clean-energy and biodiversity outcomes validated by AI, anchored in a tamper-evident audit trail, and deployed to pilot households in KwaZulu-Natal.
AI Services (Microsoft Foundry)
Pfula — an isiZulu AI assistant for South African government services on Azure AI Foundry
A bilingual (isiZulu / English) government-services assistant built on Azure OpenAI and Foundry Agent Service — where the knowledge base is the tool-set and citizen-facing streaming is first-class.
More writing
- From pilot to programme — an MVP's playbook for scaling a social-good AI project on Azure
Most impact pilots never graduate. Here's how I think about Azure credits, managed identity, Foundry, and the un-sexy operational scaffolding that lets a good idea survive contact with the real world.
- Grounding agents in government policy — lessons from building Pfula
When your knowledge base is messy government documentation translated across languages, grounding is the whole product. A practical look at Foundry Agent Service tools, MCP, and the guardrails that keep a bilingual assistant honest.
- AI for the offline last mile — IoT Hub, edge validation, and Foundry grounding for rural pilots
Bandwidth is expensive, connectivity is intermittent, and the device is probably a cheap phone. Here's the Azure pattern we use to run impact-verification agents at the edges of the network.
- Verifiable impact — making green claims auditable with Azure Confidential Ledger
Carbon credits and biodiversity offsets only work if someone else can check the evidence later. A pattern for anchoring AI-generated verifications in a tamper-evident ledger.