How to Deploy Websites Generated by AI Builders on Your Server

AI website builders can spit out functional sites in minutes. Tools like Wix AI, TeleportHQ, and Framer generate clean HTML, CSS, and JavaScript based on simple prompts. But here’s the catch: most of these platforms want to lock you into their hosting ecosystem. You pay monthly, you follow their rules, and you’re stuck with whatever performance they give you.

If you’re running a VPS or dedicated server with InMotion Hosting, you already have more control and better performance than any shared platform can offer. This guide shows you exactly how to take code from AI website builders and deploy it to your own infrastructure.

Understanding What AI Website Builders Export

Before you deploy anything, you need to understand what different AI website builders actually export. Not all platforms give you the same output, and that affects your deployment strategy.

Most AI website builders fall into three categories:

  1. Static HTML/CSS/JavaScript generators like TeleportHQ and Relume export clean, production-ready code. These tools give you standard web files you can drop onto any server. No frameworks, no build process—just HTML, CSS, and vanilla JavaScript. This is the easiest type to deploy.
  2. Framework-based generators like Framer output React components or other modern framework code. These require a build step before deployment. You’ll need Node.js to compile the code into static files or run it as a web application.
  3. Platform-locked builders like Wix and Squarespace keep your code on their servers. These won’t work for this guide because you can’t export the underlying files. If you’re using one of these, you’ll need to switch to an exportable alternative.

For this guide, we’re focusing on exportable code—whether static or framework-based. Both types work on your VPS or dedicated server, but the deployment process differs slightly.

Exporting Code from AI Website Builders

Getting your code out of the AI builder is usually straightforward, but each platform handles it differently.

TeleportHQ offers direct code export. After generating your site, click the export button and download a ZIP file containing all your HTML, CSS, JavaScript, and assets. The structure is production-ready with organized directories for images, styles, and scripts.

Framer requires publishing first, then you can export the code. The exported files include React components and a Next.js configuration. You’ll need to run npm install and npm run build to create the static output.

Relume focuses on design handoff. Export your wireframes and style guides to Figma or Webflow, then convert those to production code. If you’re using Webflow, you can export the final HTML/CSS/JavaScript from there.

Once you have your files locally, inspect the structure. Look for:

  • An index.html file (your homepage)
  • CSS files in a /css or /styles directory
  • JavaScript files in a /js or /scripts directory
  • Images in an /images or /assets directory

If you see a package.json file, you’re working with a framework-based site that needs building before deployment.
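
That check takes seconds to script. The sketch below assumes you run it from the root of the unzipped export; the messages are illustrative, not tied to any particular builder.

```shell
# Run from the root of the unzipped export to see which deployment
# path applies. A package.json means a build step is required;
# index.html alone means the site can be uploaded as-is.
if [ -f package.json ]; then
    echo "framework site: run npm install && npm run build first"
elif [ -f index.html ]; then
    echo "static site: upload the files as-is"
else
    echo "no index.html here: the export may use a nested directory"
fi
```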

Setting Up Your VPS or Dedicated Server

Your InMotion Hosting VPS or Dedicated Server needs proper configuration before hosting any website. We’ll use Ubuntu 24.04 as the base operating system and nginx as the web server. nginx generally outperforms Apache at serving static content, and it’s one of the most widely deployed web servers for exactly this kind of workload.

Installing nginx

Connect to your server via SSH using your InMotion Hosting credentials:

ssh root@your-server-ip

Update your package lists and install nginx:

apt update
apt install nginx -y

Start nginx and enable it to run on boot:

systemctl start nginx
systemctl enable nginx

Test the installation by visiting your server’s IP address in a browser. You should see the default nginx welcome page. This confirms nginx is running and accessible.

Configuring the Firewall

Your server needs firewall rules to allow HTTP and HTTPS traffic while blocking everything else. Install and configure UFW (Uncomplicated Firewall):

apt install ufw -y
ufw allow 22/tcp    # SSH access
ufw allow 80/tcp    # HTTP traffic
ufw allow 443/tcp   # HTTPS traffic
ufw enable

Verify your firewall rules:

ufw status verbose

You should see your three allowed ports listed. Never skip the SSH port (22) or you’ll lock yourself out of your server.

Creating a Deployment User

Running everything as root is bad practice. Create a dedicated user for deployments:

adduser deploy
usermod -aG sudo deploy
usermod -aG www-data deploy

This creates a user named “deploy” with sudo privileges and adds them to the www-data group, which nginx uses for file access.

Set up SSH key authentication for this user. On your local machine, generate an SSH key if you don’t have one:

ssh-keygen -t ed25519 -C "you@example.com"

Copy the public key to your server:

ssh-copy-id deploy@your-server-ip

Test the connection:

ssh deploy@your-server-ip

If you connect without entering a password, your SSH keys are configured correctly.
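
Optionally, an SSH config alias on your local machine shortens every later command. The host name myvps below is a made-up example; substitute whatever you like.

```shell
# Append a Host alias to your LOCAL ~/.ssh/config so that
# "ssh myvps" connects as the deploy user with the right key.
mkdir -p ~/.ssh
cat >> ~/.ssh/config <<'EOF'
Host myvps
    HostName your-server-ip
    User deploy
    IdentityFile ~/.ssh/id_ed25519
EOF
chmod 600 ~/.ssh/config
```

Later commands can then target myvps (for example, rsync to myvps:/var/www/yourdomain.com/) instead of spelling out the user and IP each time.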

Deploying Static Websites from AI Builders

Static sites are the simplest to deploy. They’re just files—no database, no application server, no complications.

Creating the Web Directory

Create a directory for your website. Use a structure that allows multiple sites on one server:

sudo mkdir -p /var/www/yourdomain.com
sudo chown -R deploy:www-data /var/www/yourdomain.com
sudo chmod -R 755 /var/www/yourdomain.com

This creates a directory owned by your deploy user but readable by nginx (which runs as www-data). The permissions (755) mean the owner can read/write/execute, while the group and others can only read/execute.

Uploading Your Files

Use rsync to transfer your exported website files to the server. From your local machine:

rsync -avz --delete /path/to/local/website/ deploy@your-server-ip:/var/www/yourdomain.com/

The flags mean:

  • -a: Archive mode (preserves permissions and timestamps)
  • -v: Verbose output so you see what’s happening
  • -z: Compress during transfer to save bandwidth
  • --delete: Remove files on the server that don’t exist locally

The trailing slash on the source path (/path/to/local/website/) is important. It copies the contents of the directory, not the directory itself.
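
Because --delete removes files, it’s worth previewing a transfer with -n (--dry-run) first. The self-contained demo below uses two temporary local directories to show what the preview looks like; for the real thing, swap the destination for deploy@your-server-ip:/var/www/yourdomain.com/.

```shell
# Dry-run demo: rsync lists what it WOULD transfer and delete,
# but changes nothing. Here dst holds a stale file that -n flags
# for deletion without actually removing it.
src=$(mktemp -d); dst=$(mktemp -d)
touch "$src/index.html" "$dst/stale.html"
rsync -avn --delete "$src/" "$dst/"
ls "$dst"   # stale.html is still present: nothing was deleted
```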

Configuring nginx

Create an nginx configuration file for your site:

sudo nano /etc/nginx/sites-available/yourdomain.com

Add this configuration:

server {
    listen 80;
    listen [::]:80;

    server_name yourdomain.com www.yourdomain.com;
    root /var/www/yourdomain.com;
    index index.html;

    # Enable gzip compression
    gzip on;
    gzip_vary on;
    gzip_proxied any;
    gzip_comp_level 6;
    gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript image/svg+xml;

    # Cache static assets
    location ~* \.(css|js|jpg|jpeg|png|gif|ico|svg|woff|woff2|ttf|eot)$ {
        expires 1y;
        add_header Cache-Control "public, immutable";
    }

    # Handle requests
    location / {
        try_files $uri $uri/ =404;
    }
}

This configuration handles both IPv4 and IPv6, enables compression for text-based files, and sets aggressive caching for static assets. The try_files directive tells nginx to look for exact file matches first, then try treating the URL as a directory, and finally return a 404 if nothing matches.
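
One common variation: if your builder exported a single-page app that handles routing in JavaScript, deep links like /about will 404 with the configuration above. In that case, replace the location / block with a fallback to index.html (a standard nginx pattern; adjust to taste):

```nginx
    # SPA fallback: serve index.html for any path that isn't a real
    # file, letting the client-side router resolve the URL.
    location / {
        try_files $uri $uri/ /index.html;
    }
```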

Enable the site by creating a symbolic link:

sudo ln -s /etc/nginx/sites-available/yourdomain.com /etc/nginx/sites-enabled/

Test your configuration for syntax errors:

sudo nginx -t

If the test passes, reload nginx:

sudo systemctl reload nginx

Setting Up SSL with Let’s Encrypt

Never run a website without HTTPS in 2025. Let’s Encrypt provides free SSL certificates with automated renewal.

Install Certbot:

sudo apt install certbot python3-certbot-nginx -y

Request a certificate:

sudo certbot --nginx -d yourdomain.com -d www.yourdomain.com

Certbot will ask for your email address and automatically configure SSL in your nginx setup. It modifies your configuration to redirect HTTP traffic to HTTPS and adds the necessary SSL directives.

Test automatic renewal:

sudo certbot renew --dry-run

If this succeeds, renewal is handled for you: Let’s Encrypt certificates are valid for 90 days, and Certbot’s scheduled task renews them automatically before they expire. No manual intervention required.

Deploying Framework-Based Sites

AI tools like Framer generate React or Next.js applications. These need compilation before deployment.

Building the Application

If your exported code includes a package.json file, you’re working with a Node.js application. Install dependencies and build:

cd /path/to/exported/code
npm install
npm run build

This creates a production-optimized version of your site, usually in a build/, dist/, or .next/ directory. Check the documentation for your specific framework.

For Next.js applications, you have two deployment options: static export or Node.js server.

Static export converts your Next.js app to plain HTML/CSS/JavaScript. Current Next.js versions require output: 'export' in next.config.js for this (the older next export command was removed in Next.js 14). With that setting in place, build as usual:

npm run build

Then deploy the out/ directory following the static site instructions above.

Node.js server keeps your app dynamic, which you need for features like server-side rendering or API routes. This requires running Node.js on your server with a process manager like PM2.

Running Node.js Applications with PM2

Install Node.js on your server:

curl -fsSL https://deb.nodesource.com/setup_20.x | sudo -E bash -
sudo apt install -y nodejs

Install PM2 globally:

sudo npm install -g pm2

Upload your built application to your server, then start it with PM2:

cd /var/www/yourdomain.com
pm2 start npm --name "yourdomain" -- start
pm2 save
pm2 startup   # prints a sudo command; run it once so PM2 starts on boot

Configure nginx as a reverse proxy. Edit your nginx configuration:

server {
    listen 80;
    server_name yourdomain.com www.yourdomain.com;

    location / {
        proxy_pass http://localhost:3000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}

This forwards all requests to your Node.js application running on port 3000. Add SSL using Certbot as described earlier.

Automating Deployments with GitHub Actions

Manual deployments work fine initially, but you’ll want automation once you start making changes. GitHub Actions can build and deploy your site automatically whenever you push code.

Setting Up Your Repository

Initialize a git repository in your local code directory:

cd /path/to/your/website
git init
git add .
git commit -m "Initial commit"

Create a repository on GitHub and push your code:

git remote add origin https://github.com/yourusername/yourrepo.git
git branch -M main
git push -u origin main

Creating Deployment Secrets

GitHub needs your SSH private key to connect to your server. Go to your repository settings, then Secrets and Variables → Actions.

Add these secrets:

  • SSH_PRIVATE_KEY: Your deploy user’s private SSH key
  • SSH_HOST: Your server’s IP address or domain
  • SSH_USER: Your deploy username (usually “deploy”)
  • DEPLOY_PATH: The deployment path on your server (e.g., /var/www/yourdomain.com)

To get your private key:

cat ~/.ssh/id_ed25519

Copy the entire output including the -----BEGIN and -----END lines.
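
A safer pattern than pasting your personal key is to generate a key pair used only by CI. The file name github_deploy below is arbitrary; the private half goes into SSH_PRIVATE_KEY, and the public half gets appended to the deploy user’s ~/.ssh/authorized_keys on the server.

```shell
# Generate a passphrase-less key pair dedicated to GitHub Actions.
mkdir -p ~/.ssh
ssh-keygen -t ed25519 -f ~/.ssh/github_deploy -N "" -C "github-actions"
cat ~/.ssh/github_deploy       # paste into the SSH_PRIVATE_KEY secret
cat ~/.ssh/github_deploy.pub   # append to authorized_keys on the server
```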

Creating the Workflow

Create a .github/workflows/deploy.yml file in your repository:

name: Deploy to VPS

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
    - name: Checkout code
      uses: actions/checkout@v3

    - name: Deploy via rsync
      uses: burnett01/rsync-deployments@6.0.0
      with:
        switches: -avzr --delete
        path: ./
        remote_path: ${{ secrets.DEPLOY_PATH }}
        remote_host: ${{ secrets.SSH_HOST }}
        remote_user: ${{ secrets.SSH_USER }}
        remote_key: ${{ secrets.SSH_PRIVATE_KEY }}

For framework-based sites, add build steps before deployment:

name: Deploy to VPS

on:
  push:
    branches: [ main ]

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
    - name: Checkout code
      uses: actions/checkout@v3

    - name: Setup Node.js
      uses: actions/setup-node@v3
      with:
        node-version: '20'

    - name: Install dependencies
      run: npm ci

    - name: Build application
      run: npm run build

    - name: Deploy via rsync
      uses: burnett01/rsync-deployments@6.0.0
      with:
        switches: -avzr --delete
        path: ./dist/
        remote_path: ${{ secrets.DEPLOY_PATH }}
        remote_host: ${{ secrets.SSH_HOST }}
        remote_user: ${{ secrets.SSH_USER }}
        remote_key: ${{ secrets.SSH_PRIVATE_KEY }}

Commit and push this workflow file. GitHub will automatically run it on every push to the main branch.

Security Considerations for AI Builder-Generated Code

Code from AI website builders introduces specific security risks you can’t ignore. Research from Veracode shows that 45% of AI-generated code contains security vulnerabilities, with Java having the highest failure rate at 72%. While AI builders produce more polished output than raw code generators, you still need to validate what they create.

Common Vulnerabilities in AI Builder Output

Input validation failures are widespread. AI builders often generate code that accepts user input without sanitization. If your site from an AI builder has any forms or URL parameters, inspect the JavaScript for proper validation.

Cross-site scripting (XSS) appears in 86% of AI-generated code samples. AI models don’t automatically escape user-generated content. Check any dynamic content rendering in your JavaScript.

Dependency issues happen when AI suggests outdated or non-existent packages. Always verify that imported libraries exist and are current. Run npm audit for Node.js projects to check for known vulnerabilities.

Overly permissive configurations occur because AI optimizes for functionality, not security. Review any CORS headers, API endpoints, or file permissions in your code.

Security Hardening Steps

Run security scans on your exported code before deployment:

# For Node.js projects
npm audit --audit-level=moderate

# For static sites, check for common issues
grep -r "eval(" ./
grep -r "innerHTML" ./
grep -r "document.write" ./

These patterns often indicate security problems. eval() can execute arbitrary code, innerHTML can introduce XSS vulnerabilities, and document.write() can cause various issues.
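
To make the scan repeatable, the same greps can be rolled into a small script that lists every offending file. The directory argument and patterns are just a starting point, not a complete audit.

```shell
# List files containing patterns that deserve a manual review.
# Scans the current directory by default; pass a path to override.
scan_dir="${1:-.}"
hits=$(grep -rl --include='*.js' --include='*.html' \
    -E 'eval\(|innerHTML|document\.write' "$scan_dir" 2>/dev/null)
if [ -n "$hits" ]; then
    echo "review these files:"
    echo "$hits"
else
    echo "no risky patterns found"
fi
```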

Implement Content Security Policy (CSP) headers in your nginx configuration:

add_header Content-Security-Policy "default-src 'self'; script-src 'self' 'unsafe-inline'; style-src 'self' 'unsafe-inline'; img-src 'self' data: https:; font-src 'self' data:;" always;
add_header X-Content-Type-Options "nosniff" always;
add_header X-Frame-Options "SAMEORIGIN" always;
add_header X-XSS-Protection "1; mode=block" always;

These headers prevent common attacks by restricting what resources your site can load and how browsers handle your content. (X-XSS-Protection is a legacy header that modern browsers ignore; it’s included only for older clients and can be dropped without losing protection in current browsers.)

Regular Updates and Monitoring

Set up automatic security updates on your server:

sudo apt install unattended-upgrades -y
sudo dpkg-reconfigure --priority=low unattended-upgrades

Monitor your nginx logs for suspicious activity:

sudo tail -f /var/log/nginx/access.log
sudo tail -f /var/log/nginx/error.log

Look for patterns like repeated 404 errors, requests to uncommon file extensions, or attempts to access admin paths that don’t exist.
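
A couple of awk one-liners condense those logs into something scannable. They assume nginx’s default combined log format, where the client IP is field 1, the request path field 7, and the status code field 9.

```shell
# Top client IPs and most frequent 404 paths from an nginx access log.
# Point LOG at your server's real file.
LOG=/var/log/nginx/access.log
awk '{print $1}' "$LOG" 2>/dev/null | sort | uniq -c | sort -rn | head
awk '$9 == 404 {print $7}' "$LOG" 2>/dev/null | sort | uniq -c | sort -rn | head
```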

Performance Optimization

Code exported from AI website builders isn’t always optimized for performance. These adjustments will significantly improve load times.

Enable HTTP/2

HTTP/2 allows browsers to load multiple resources simultaneously. Edit your nginx SSL configuration:

listen 443 ssl http2;
listen [::]:443 ssl http2;

Reload nginx to apply the change. Note that on nginx 1.25.1 and later, the http2 flag on the listen directive is deprecated; there, use a separate http2 on; directive inside the server block instead.

Optimize Images

Sites from AI builders often include unoptimized images. Install image optimization tools on your server:

sudo apt install jpegoptim optipng -y

Optimize all images in your web directory:

find /var/www/yourdomain.com -name "*.jpg" -exec jpegoptim --strip-all --max=85 {} \;
find /var/www/yourdomain.com -name "*.png" -exec optipng -o2 {} \;

Consider implementing WebP format with fallbacks for better compression.
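
One way to do that is sketched below using cwebp from the webp package (sudo apt install webp -y); the quality setting of 80 is a reasonable default, not a rule.

```shell
# Create a .webp copy next to each .jpg, keeping the originals so
# the site still works in browsers without WebP support.
WEBROOT=/var/www/yourdomain.com
if [ -d "$WEBROOT" ]; then
    find "$WEBROOT" -name '*.jpg' \
        -exec sh -c 'cwebp -q 80 "$1" -o "${1%.jpg}.webp"' _ {} \;
fi
```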

Configure Browser Caching

Add cache headers for all static resources in your nginx configuration:

location ~* \.(css|js|jpg|jpeg|png|gif|ico|svg|woff|woff2|ttf|eot)$ {
    expires 1y;
    add_header Cache-Control "public, immutable";
}

location ~* \.(html)$ {
    expires 1h;
    add_header Cache-Control "public, must-revalidate";
}

This caches assets for one year but checks HTML files every hour for updates.

Troubleshooting Common Issues

404 errors for CSS/JavaScript: Check file paths in your HTML. Code exported from AI builders sometimes uses absolute paths like /assets/style.css, which only resolve when the site lives at the domain root. If you’re serving from a subdirectory, switch to relative paths (assets/style.css) or update the paths to match your server’s structure.

Permission denied errors: Your deploy user needs write access and nginx needs read access. Fix with:

sudo chown -R deploy:www-data /var/www/yourdomain.com
sudo chmod -R 755 /var/www/yourdomain.com

Mixed content warnings: Your site loads over HTTPS but references HTTP resources. Update all asset URLs to use HTTPS or protocol-relative URLs (//example.com/image.jpg).
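
A recursive grep flushes these out quickly; point it at your deployed directory (or your local export) and fix each hit.

```shell
# List every hard-coded http:// reference in the deployed files.
# Each hit is a potential mixed-content warning under HTTPS.
grep -rn --include='*.html' --include='*.css' --include='*.js' \
    'http://' /var/www/yourdomain.com || echo "no http:// references found"
```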

nginx configuration errors: Always test before reloading:

sudo nginx -t

This catches syntax errors before they break your live site.

GitHub Actions deployment failures: Check your SSH key format and ensure your deploy user has write permissions. Test SSH access manually:

ssh -i ~/.ssh/id_ed25519 deploy@your-server-ip "ls -la /var/www/yourdomain.com"

Moving Forward

Deploying websites from AI builders to your own infrastructure gives you control that platform-locked solutions can’t match. You manage performance, security, and costs directly. You can host dozens of sites on a single VPS or dedicated server with InMotion Hosting.

Start with static sites to learn the workflow. Once you’re comfortable with manual deployments, add GitHub Actions for automation. Test your security configuration regularly and keep your server updated.

The AI tools will keep improving, but the deployment fundamentals remain constant. Master nginx, understand SSH and rsync, and you’ll be able to deploy any AI-generated code to your own infrastructure.

Scalable VPS Infrastructure, Fully Managed

When shared hosting can't handle your traffic, VPS delivers dedicated resources that scale with demand. Our team manages the technical complexity while you manage your business.

NVMe Storage · High-Availability · Ironclad Security · Premium Support

VPS Hosting

Carrie Smaha, Senior Manager Marketing Operations

Carrie enjoys working on demand generation and product marketing projects that tap into multi-touch campaign design, technical SEO, content marketing, software design, and business operations.
