Nitesh_Jain presents a comprehensive sample demonstrating how to add enterprise-grade Azure AI scenarios to Flask apps, including conversational, reasoning, and multimodal AI—all with secure App Service integration and automated deployment.

Azure App Service AI Scenarios: Complete Sample with AI Foundry Integration

This guide shows developers how to quickly add advanced AI capabilities to web apps using Azure App Service and Azure AI Foundry. The sample code and instructions focus on integrating with Flask but can be adapted for similar Python-based web projects.

Key AI Scenarios Implemented

  • Conversational AI: Natural language processing with context and session management (see the sketch after this list)
  • Reasoning Models: Step-by-step problem-solving features
  • Structured Output: JSON responses for integrations and schema validation
  • Multimodal Processing: Run image analysis and audio transcription using vision and audio models
  • Enterprise Chat: Prebuilt assistant with retail scenarios and business intelligence features
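
For the conversational scenario, the core pattern is to keep a per-session message history and send it with every completion call. The sketch below illustrates that pattern with the openai Python SDK against an Azure AI Foundry (Azure OpenAI) endpoint; the environment variable names, deployment name, and in-memory session store are illustrative assumptions, not the sample's exact code.

# Minimal sketch: conversational AI with per-session history (assumed names, not the sample's code)
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # assumed variable name
    api_key=os.environ["AZURE_OPENAI_API_KEY"],          # local dev; production uses managed identity
    api_version="2024-06-01",
)

sessions = {}  # session_id -> message history (in-memory, demo only)

def chat(session_id, user_text, deployment="gpt-4o-mini"):
    history = sessions.setdefault(session_id, [
        {"role": "system", "content": "You are a helpful retail assistant."},
    ])
    history.append({"role": "user", "content": user_text})
    response = client.chat.completions.create(model=deployment, messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer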

Quick Start—Automated Azure Deployment

Prerequisites: an Azure subscription, Git, and the Azure Developer CLI (azd) installed locally.

Deployment Models: Supports both new and existing endpoints. Recommended models: gpt-4o-mini, gpt-35-turbo. Some advanced models may be region-limited.

Install and Deploy:

git clone https://github.com/Azure-Samples/azure-app-service-ai-scenarios-integrated-sample.git
cd azure-app-service-ai-scenarios-integrated-sample
azd up

A single azd up run installs dependencies, provisions the required Azure resources, sets up managed identities, and deploys the app.

Deployment Prompts:

  • Choose resource group and location
  • Specify or create Azure AI Foundry endpoints
  • Assign managed identity and select model names

What Gets Deployed

  • Azure App Service (Basic B2, Python 3.11)
  • Azure AI Foundry resources: AI project workspace, storage account, and model deployments (when creating new endpoints)
  • Managed identity for secure authentication and role-based access
  • Cognitive Services roles assigned to App Service
  • Environment variables and permissions automatically set
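
Because the App Service runs under a managed identity with the roles above, the app can call the AI Foundry endpoint without storing keys. Below is a minimal sketch of that keyless pattern, assuming the endpoint is exposed through an environment variable set by the deployment.

# Minimal sketch: keyless auth from App Service using managed identity (assumed variable name)
import os
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# Exchange the managed identity for a Cognitive Services token on each request
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",
)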

Local Development Setup

  • Python 3.8+ required
  • Install requirements and run the app:
pip install -r requirements.txt
python app.py
  • Browse to http://localhost:5000 and open the /settings page for configuration
  • Provide your Azure AI Foundry endpoint and API key for local runs; in production, managed identity is used instead
  • Review the Setup Guide for deploying Azure AI Foundry models

Testing Core Features

  • Access floating chat popup on homepage
  • Test conversational AI, product inquiries, and customer service flows
  • Upload test images and audio files for multimodal analysis (a vision-call sketch follows this list)
  • Validate structured reasoning by entering complex business questions
  • Each response should be relevant and demonstrate the selected scenario's capabilities
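
As a rough illustration of the multimodal path, the sketch below sends a base64-encoded image to a vision-capable deployment (such as gpt-4o-mini) and asks for a description; the client setup, variable names, and prompt are assumptions rather than the sample's exact code.

# Minimal sketch: image analysis with a vision-capable deployment (assumed names)
import base64
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)

def describe_image(path, deployment="gpt-4o-mini"):
    with open(path, "rb") as f:
        b64 = base64.b64encode(f.read()).decode("utf-8")
    response = client.chat.completions.create(
        model=deployment,
        messages=[{
            "role": "user",
            "content": [
                {"type": "text", "text": "Describe this image for a product catalog."},
                # Assumes a PNG upload; adjust the MIME type for other formats
                {"type": "image_url", "image_url": {"url": f"data:image/png;base64,{b64}"}},
            ],
        }],
    )
    return response.choices[0].message.content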

Integration with Existing Flask Applications

Required Azure Resources

  • Azure AI Foundry endpoint with deployed models

Integration Steps

  1. Enable Managed Identity
  2. Assign roles for Cognitive Services and Azure AI Developer
  3. Copy AIPlaygroundCode folder to your app
  4. Merge dependencies in requirements.txt (e.g. flask, azure-identity, openai, pillow, pydub)
  5. Add and configure environment variables for inference endpoints and model names
  6. Add a /settings route and templates for AI configuration (a minimal blueprint sketch follows these steps)
  7. Integrate chat interface HTML, CSS, and JavaScript into your app templates
  8. Test that the chat popup, file uploads, and all AI scenarios work end-to-end
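
As a skeleton for step 6, the sketch below adds a /settings route as a Flask blueprint and registers it in an existing app; the blueprint, template, and setting names are illustrative assumptions and do not mirror the actual AIPlaygroundCode layout.

# Minimal sketch: a /settings blueprint for AI configuration (illustrative names only)
import os
from flask import Blueprint, Flask, redirect, render_template, request, url_for

ai_settings = Blueprint("ai_settings", __name__)

@ai_settings.route("/settings", methods=["GET", "POST"])
def settings():
    if request.method == "POST":
        # Demo only: persist real settings in App Service app settings or Key Vault instead
        os.environ["AZURE_OPENAI_ENDPOINT"] = request.form["endpoint"]
        os.environ["AZURE_OPENAI_DEPLOYMENT"] = request.form["deployment"]
        return redirect(url_for("ai_settings.settings"))
    return render_template(
        "settings.html",  # a template you add alongside your existing ones
        endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT", ""),
        deployment=os.environ.get("AZURE_OPENAI_DEPLOYMENT", ""),
    )

app = Flask(__name__)
app.register_blueprint(ai_settings)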

Cleaning Up Resources

  • Use azd down to remove deployed resources when finished
  • Or delete resource groups directly from Azure Portal
  • Resource deletion may take up to 10 minutes

Cost and Security Guidance

Reference Resources

Support

  • File issues and feature requests via GitHub Issues or Discussions

Important Notices

  • This is a developer sample. Consider enabling additional security and monitoring before using in production.

Author: Nitesh_Jain

This post appeared first on “Microsoft Tech Community”. Read the entire article here