
    IronCloud LLM - Private Self-Hosted ChatGPT-Style AI | NIST & CMMC Ready

    Deployed on AWS
    AWS Free Tier
    This product has charges associated with it for the IronCloud LLM platform and support. Deploy your own secure, ChatGPT-style AI in minutes - fully private, self-hosted, and compliant with NIST 800-171, CMMC, and ITAR. IronCloud LLM runs entirely inside your AWS environment, keeping all data, prompts, and files under your control. No SaaS, no data leaks - just powerful AI you own. Built for regulated industries such as government, defense, finance, and healthcare, IronCloud LLM supports GPT-4, Claude, Gemini, AWS Bedrock, and more.

    Overview


    This is a repackaged software product wherein additional charges apply for the IronCloud LLM platform and support. Charges cover the IronCloud LLM deployment framework, compliance hardening, packaging, and ongoing updates. These charges do not include access to third-party APIs. Customers must supply their own credentials if they choose to connect to optional external AI providers.

    IronCloud LLM makes it simple to deploy a ChatGPT-style AI assistant directly into your AWS environment without giving up privacy, compliance, or control. Designed for regulated industries, this AMI launches a secure, self-hosted AI interface that can connect to top models like GPT-4, Claude, and Gemini while keeping prompts, files, and conversation history within your AWS account.

    Originally built for government and defense, IronCloud LLM also powers financial services, healthcare, manufacturing, and other sectors that demand strict data protection. It runs in any AWS region, including GovCloud, and is built to align with NIST 800-171, CMMC, ITAR, HIPAA, and similar compliance frameworks.

    External API Usage Disclosure: IronCloud LLM includes a functional deployment out of the box. Customers can immediately launch the instance, access the web interface, and use the included components without requiring any external services. Optionally, customers may configure connections to external AI APIs (such as OpenAI, Anthropic, or AWS Bedrock) by providing their own credentials. IronCloud LLM does not provide, resell, or bundle these third-party services.

    Key features:

    Fully private chat interface. Your prompts and files stay in your AWS environment.

    Supports leading models: GPT-4, Claude, Gemini, Bedrock, Azure OpenAI, and more.

    Multimodal: chat with documents, images, and enterprise data sources.

    Multi-user support with secure authentication and audit logging.

    Flexible deployment in GovCloud or standard AWS regions.

    Quick launch. Ready to use in minutes with minimal setup.

    Whether you are streamlining engineering workflows, securing sensitive client data, or deploying AI within compliance-bound operations, IronCloud LLM delivers modern AI power under your control, deployed in your cloud, on your terms.

    Highlights

    • Launch a private, self-hosted ChatGPT-style AI in your AWS account with no SaaS dependency and full data control. You own the stack and the security. Perfect for organizations that demand privacy without sacrificing modern AI capabilities.
    • Built for regulated industries, IronCloud LLM aligns with NIST 800-171, CMMC, ITAR, HIPAA, and SOC 2 controls. Runs in standard AWS regions or GovCloud while supporting GPT-4, Claude, Gemini, and Bedrock models.
    • Go beyond basic chat with multimodal AI that can search, summarize, and analyze your documents, images, and enterprise data sources. Includes multi-user authentication, role-based permissions, audit logging, and a clean, intuitive interface.

    Details

    Delivery method

    Delivery option
    64-bit (x86) Amazon Machine Image (AMI)

    Latest version

    Operating system
Amazon Linux 2023

    Deployed on AWS


    Pricing

    IronCloud LLM - Private Self-Hosted ChatGPT-Style AI | NIST & CMMC Ready

    Pricing is based on actual usage, with charges varying according to how much you consume. Subscriptions have no end date and may be canceled any time.
Additional AWS infrastructure costs may apply. Use the AWS Pricing Calculator to estimate your infrastructure costs.

    Usage costs (575)

Dimension (instance type)     Cost/hour
t3.large (Recommended)        $0.965
t2.micro (AWS Free Tier)      $0.965
t3.micro (AWS Free Tier)      $0.965
r6a.8xlarge                   $0.965
c5n.9xlarge                   $0.965
m5.12xlarge                   $0.965
r6a.large                     $0.965
m7i-flex.xlarge               $0.965
r5d.4xlarge                   $0.965
r6in.24xlarge                 $0.965

    Vendor refund policy

Refunds are available within 7 days of purchase for deployment or technical issues that cannot be resolved by IronCloud support. Contact support@ironcloudllm.com with your order ID and issue details to request a refund.


    Legal

    Vendor terms and conditions

Upon subscribing to this product, you must acknowledge and agree to the terms and conditions outlined in the vendor's End User License Agreement (EULA).

    Content disclaimer

    Vendors are responsible for their product descriptions and other product content. AWS does not warrant that vendors' product descriptions or other product content are accurate, complete, reliable, current, or error-free.


    Delivery details

    64-bit (x86) Amazon Machine Image (AMI)

    Amazon Machine Image (AMI)

    An AMI is a virtual image that provides the information required to launch an instance. Amazon EC2 (Elastic Compute Cloud) instances are virtual servers on which you can run your applications and workloads, offering varying combinations of CPU, memory, storage, and networking resources. You can launch as many instances from as many different AMIs as you need.

    Version release notes

    Initial GA release of IronCloud LLM AMI.

    Amazon Linux 2023

    Docker and Docker Compose preinstalled

    LibreChat v0.7.9 configured to run via Docker Compose

    Health check script and systemd unit provided

    Quick start docs on instance at /opt/ironcloud/README.md

    Security-hardened defaults: SSH only, app port closed until opened in Security Group

    Additional details

    Usage instructions


Launching the Instance
Deploy the AMI from AWS Marketplace using the recommended instance type (t3.large or larger for production workloads). Assign at least 40 GB of gp3 storage. Launch the AMI in a VPC with outbound internet access if you plan to configure optional external APIs.

Configure Security Group Rules
Open only the required ports:
22/TCP for SSH, restricted to your admin IP or CIDR.
3080/TCP for the IronCloud LLM web UI, restricted to trusted IP ranges.

If terminating HTTPS directly on the instance, also open 443/TCP and install TLS certificates via a reverse proxy such as Nginx or Traefik.
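The "restricted to your admin IP" rule above is easiest to get right if you compute the CIDR once rather than typing it by hand. A minimal sketch (assuming curl is available on your workstation; checkip.amazonaws.com is AWS's public IP-echo endpoint, and the fallback only keeps the script usable offline):

```shell
# Discover the workstation's public IP and build the /32 CIDR to allow
# on ports 22 and 3080 in the instance's Security Group.
ADMIN_IP="$(curl -fsS --max-time 5 https://checkip.amazonaws.com 2>/dev/null || echo 0.0.0.0)"
ADMIN_CIDR="${ADMIN_IP}/32"
echo "Allow 22/tcp and 3080/tcp from: ${ADMIN_CIDR}"
```

Paste the printed CIDR into the Security Group's inbound rules; avoid 0.0.0.0/0 except for short-lived testing.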

Connect via SSH
Download the EC2 key pair you selected at launch, then from your workstation run:
ssh -i your-key.pem ec2-user@instance-public-ip

The default OS user on Amazon Linux 2023 is ec2-user.
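A common first-connection stumble is key-file permissions: ssh refuses private keys that are group- or world-readable. A hedged sketch of the connection step ("your-key.pem" and "instance-public-ip" are the same placeholders used above, not real values):

```shell
# Tighten key permissions before connecting; ssh rejects permissive modes.
KEY="your-key.pem"
HOST="instance-public-ip"
if [ -f "$KEY" ]; then
  chmod 400 "$KEY"
  ssh -i "$KEY" "ec2-user@${HOST}"
else
  MSG="Key file ${KEY} not found in $(pwd)"
  echo "$MSG"
fi
```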

Application File Locations
The application stack is installed in /opt/ironcloudllm. Key paths include:
/opt/ironcloudllm/docker-compose.yml - Compose file for all services.
/opt/ironcloudllm/.env - Environment variables and configuration overrides.
/opt/ironcloudllm/logs/ - Application logs.
/opt/ironcloudllm/client/public/assets/ - Static assets such as the logo and favicon.
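Before editing .env or docker-compose.yml, it is worth snapshotting them so a bad edit is easy to roll back. A small sketch under the paths documented above (run on the instance; the /tmp backup name is illustrative):

```shell
# Back up the two key config files before making changes.
APP_DIR="/opt/ironcloudllm"
STAMP="$(date +%Y%m%d-%H%M%S)"
BACKUP="/tmp/ironcloud-config-${STAMP}.tgz"
if [ -d "$APP_DIR" ]; then
  sudo tar -czf "$BACKUP" -C "$APP_DIR" .env docker-compose.yml
  RESULT="backed up to ${BACKUP}"
else
  RESULT="${APP_DIR} not present (run this on the instance)"
fi
echo "$RESULT"
```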

Starting the Application
cd /opt/ironcloudllm
sudo docker compose pull
sudo docker compose up -d

    This starts the LibreChat-based web UI, API backend, and supporting services.

Access the Web Interface
Open a browser and visit: http://instance-public-ip:3080

    When you access the application for the first time, click Sign up to create an account. The first account created automatically becomes the administrator. The admin account manages agents, prompts, and memory permissions.
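If the page does not load, checking whether the UI is listening locally helps separate an application problem from a Security Group problem. A sketch assuming the documented port 3080 (run on the instance itself):

```shell
# Probe the web UI from the instance; a local failure points at the
# containers, a local success points at Security Group / network rules.
URL="http://localhost:3080"
if curl -fsS --max-time 5 -o /dev/null "$URL" 2>/dev/null; then
  STATE="reachable"
else
  STATE="not reachable (check: sudo docker ps)"
fi
echo "UI at ${URL}: ${STATE}"
```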

Local Functionality (Out of the Box)
The AMI includes a working chat interface and retrieval-augmented generation (RAG) features that are functional immediately after launch, without any external services.

Optional External Integrations
From the web UI, go to Settings > API Keys and enter credentials for providers such as AWS Bedrock, OpenAI, Azure OpenAI, or Anthropic.

    These integrations are optional and not required for core functionality.

    Customers must provide their own API credentials.

    IronCloud LLM does not provide, resell, or bundle third-party API access.

Updating the Stack
cd /opt/ironcloudllm
sudo docker compose pull
sudo docker compose up -d

Stopping or Restarting
Stop: cd /opt/ironcloudllm && sudo docker compose down
Restart: sudo docker compose restart

Logs and Troubleshooting
View logs: cd /opt/ironcloudllm && sudo docker compose logs -f
Check container health: sudo docker ps

Security Best Practices
Limit inbound access to ports 22 and 3080 to trusted IPs. Use an Application Load Balancer or reverse proxy for HTTPS/TLS termination. Regularly update both the OS and the Docker images.

Documentation
Expanded documentation is available at https://ironcloudllm.com/docs

    Support

    Vendor support

Email: support@ironcloudllm.com
Website: https://ironcloudllm.com/support

    IronCloud provides email-based technical support for all AWS Marketplace subscribers, covering deployment assistance, configuration guidance, and troubleshooting for your IronCloud LLM AMI. Support is available Monday through Friday, 8:00 AM to 6:00 PM Pacific Time, with a typical response time of one business day. Urgent production-impacting issues receive priority handling. Documentation, setup guides, and FAQs are available online 24/7.

    AWS infrastructure support

    AWS Support is a one-on-one, fast-response support channel that is staffed 24x7x365 with experienced and technical support engineers. The service helps customers of all sizes and technical abilities to successfully utilize the products and features provided by Amazon Web Services.


Customer reviews

0 ratings; no customer reviews yet.
Be the first to review this product. We've partnered with PeerSpot to gather customer feedback. You can share your experience by writing or recording a review, or scheduling a call with a PeerSpot analyst.