Deploying web apps on Vercel is a breeze—Next.js, serverless functions, and edge middleware all come together seamlessly, whether you’re launching a quick side project or scaling a production-grade application. But once your app’s live, how do you keep an eye on what’s happening behind the scenes? Are your functions timing out? Is middleware misbehaving? Vercel’s dashboard offers a basic peek with request stats, but it leaves out the juicy details—like console.log outputs or build error specifics—that you need for serious debugging. That’s where a robust observability solution comes in handy, making it easy to spot issues and optimize performance.

In this comprehensive guide, we’ll walk you through monitoring Vercel logs with OpenObserve using a serverless function deployed right in your Vercel project. We’ll cover all the key steps: configuring the log drain, testing the flow, and troubleshooting any hiccups that may occur along the way.

Why OpenObserve for Vercel Log Monitoring?

Vercel’s Monitoring tab offers useful information at a quick glance—request counts, error statuses—but it’s not designed for in-depth troubleshooting. Need to see what your code is logging or why a build failed? You’re left digging through limited data. OpenObserve fills this gap by collecting every log your Vercel app produces: serverless function outputs, edge request details, build logs, and more. Its columnar compression slashes storage costs compared to tools like Elasticsearch, while its high-speed queries enable real-time analysis. With features like custom dashboards and alerts, OpenObserve scales seamlessly with your app, from small projects to high-traffic production environments.

For Vercel users, this means catching a timeout in the iad1 region or debugging middleware issues without piecing together clues from Vercel’s native interface. It’s observability that’s powerful, affordable, and tailored to your needs.

Prerequisites

Before we get started, let’s ensure you have everything in place:

  • A Vercel account with a deployed application. I’ll use a Next.js project like blog-starter-kit as an example, but any Vercel app works—it just needs to be live and generating logs.
  • An OpenObserve instance up and running. You can choose between the Cloud and Self-hosted options.
  • A Vercel Pro or Enterprise plan—Hobby plans don’t support Log Drains. Check your tier at vercel.com/account/plan.
  • Basic comfort with Vercel’s dashboard and access to a Terminal.

Got everything? Let’s get started.

Send Vercel Logs to OpenObserve

Vercel Log Drains allow you to forward logs—such as function executions or build outputs—to an external endpoint. To avoid re-verification issues when changing URLs, we’ll deploy a serverless function within your Vercel project as the Log Drain endpoint. This function will verify requests from Vercel and forward logs to OpenObserve. Here’s the detailed process.

1.1 Obtain OpenObserve Endpoint and Authentication Token

To send logs to OpenObserve, you need its ingestion endpoint and an API key for authentication. Follow these steps to retrieve them:

  1. Log into OpenObserve.
  2. Locate the ingestion endpoint:
    • In the left sidebar, navigate to Data Sources → Custom → Logs to obtain your ingestion endpoint and authorization token.

The endpoint will look like this:

https://api.openobserve.ai/api/<your-org-id>/<your-stream-name>/_json
  • <your-org-id>: Your organization ID, found under “Settings > Organization” (e.g., nitya_organization_50763_nJO9encbUWsGmSu). Self-hosted instances default to default unless modified.
  • <your-stream-name>: Leave as default, or (recommended) use a custom stream name such as vercel_logs.
  • Example: https://api.openobserve.ai/api/nitya_organization_50763_nJO9encbUWsGmSu/vercel_logs/_json
  • Copy the full URL, including /_json, which indicates that logs are sent as JSON arrays.
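If you prefer to assemble the URL in code, a tiny helper like this (hypothetical, not part of the function you'll deploy) makes the pieces explicit:

```javascript
// Hypothetical helper: assemble an OpenObserve ingestion URL from its parts.
function buildIngestUrl(orgId, streamName) {
  return `https://api.openobserve.ai/api/${orgId}/${streamName}/_json`;
}

// Self-hosted instances usually use the "default" org unless changed.
console.log(buildIngestUrl('default', 'vercel_logs'));
// → https://api.openobserve.ai/api/default/vercel_logs/_json
```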

The authorization token will look like this:

Basic bml0eWFAb3Blbm9ic2VydmUuYWk6ODAyZ1ozdW80TjVTOTE3czZNZWQ=
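This token is standard HTTP Basic authentication: a base64 encoding of your email and ingest credential joined by a colon. A quick sketch of how such a token is formed (placeholder values, not real credentials):

```javascript
// The "Basic" token is base64("<email>:<password-or-ingest-token>").
// These values are placeholders for illustration only.
const email = 'user@example.com';
const token = 'my-ingest-token';
const basic = Buffer.from(`${email}:${token}`).toString('base64');
console.log(`Authorization: Basic ${basic}`);
```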

1-o2-data-sources.gif

  3. Test the Endpoint (Optional):

Verify connectivity with a simple test:

curl -X POST -H "Authorization: Basic <your-api-key>" -H "Content-Type: application/json" \
-d '[{"test": "hello"}]' https://api.openobserve.ai/api/<your-org-id>/vercel_logs/_json

A 200 or 204 response confirms success. If you get a 400, ensure your JSON is an array ([{"key": "value"}]). A 401 indicates an incorrect API key, and a 404 suggests an error in the org ID or stream name.

2-curl-test.gif

1.2 Deploy a Vercel Function

Create a serverless function within your Vercel project to serve as the Log Drain endpoint, handling verification and log forwarding:

A. Create the Function File

In your project’s root directory (e.g., blog-starter-kit), create a file at api/logs.js with the following code. Modify it as needed for a project utilizing TypeScript or another language:

// Vercel Function for log drain to OpenObserve
const https = require('https');

// Configuration
// Replace with your own
const OPENOBSERVE_URL = 'https://api.openobserve.ai/api/<your-org-id>/vercel_logs/_json';
const API_KEY = 'bml0eWFAb3Blbm9ic2VydmUuYWk6ODAyZ1ozdW80TjVTOTE3czZNZWQ=';
// You can get this token from the Log Drain setup process, detailed in section 1.3 below
const VERCEL_VERIFY_HEADER = 'fd45a95690b57b4b161f0b4413ce086ac5255783';

// Helper function to log details
function logInfo(message, data) {
  console.log(`INFO: ${message}`, JSON.stringify(data || {}));
}

// Helper function to send data to OpenObserve
async function sendToOpenObserve(data) {
  return new Promise((resolve, reject) => {
    const body = JSON.stringify(data);
    
    const options = {
      method: 'POST',
      headers: {
        'Authorization': `Basic ${API_KEY}`,
        'Content-Type': 'application/json',
        'stream-name': 'vercel_logs'
      }
    };
    
    logInfo('Sending to OpenObserve', { dataSize: body.length });
    
    const req = https.request(OPENOBSERVE_URL, options, (res) => {
      let responseData = '';
      
      res.on('data', (chunk) => {
        responseData += chunk;
      });
      
      res.on('end', () => {
        logInfo('OpenObserve response', { 
          status: res.statusCode,
          response: responseData
        });
        
        if (res.statusCode >= 200 && res.statusCode < 300) {
          resolve({ success: true, status: res.statusCode, data: responseData });
        } else {
          reject(new Error(`OpenObserve returned status ${res.statusCode}: ${responseData}`));
        }
      });
    });
    
    req.on('error', (error) => {
      logInfo('Error sending to OpenObserve', { error: error.message });
      reject(error);
    });
    
    req.write(body);
    req.end();
  });
}

// Main handler function
module.exports = async (req, res) => {
  // Log basic request info
  logInfo('Request received', {
    method: req.method,
    url: req.url,
    headers: req.headers,
    query: req.query
  });
  
  // Always include verification header
  res.setHeader('x-vercel-verify', VERCEL_VERIFY_HEADER);
  
  // Handle verification requests
  if (req.method === 'GET' || req.method === 'HEAD') {
    logInfo('Handling verification request');
    return res.status(200).send('Verified');
  }
  
  // Handle log forwarding
  if (req.method === 'POST') {
    try {
      // Check if this is a verification POST
      if (req.headers['x-vercel-verify'] === 'true') {
        logInfo('Handling verification POST request');
        return res.status(200).send('Verified');
      }
      
      // Process log data
      if (!req.body) {
        logInfo('No body in request');
        return res.status(200).send('No data to process');
      }
      
      // Log the received data (truncated for readability)
      const bodyStr = JSON.stringify(req.body);
      logInfo('Received log data', { 
        preview: bodyStr.substring(0, 200) + (bodyStr.length > 200 ? '...' : ''),
        size: bodyStr.length
      });
      
      // Prepare data for OpenObserve
      const logData = {
        ...req.body,
        timestamp: req.body.timestamp || new Date().toISOString(),
        _meta: {
          received_at: new Date().toISOString(),
          source: 'vercel',
          function: 'log-drain'
        }
      };
      
      // Send to OpenObserve
      try {
        const result = await sendToOpenObserve(logData);
        logInfo('Successfully forwarded to OpenObserve', result);
      } catch (error) {
        logInfo('Failed to send to OpenObserve', { error: error.message });
        // Continue execution - we still want to return 200 to Vercel
      }
      
      // Always return success to Vercel to prevent retries
      return res.status(200).send('Logs processed');
    } catch (error) {
      logInfo('Error processing request', { error: error.message, stack: error.stack });
      // Still return success to prevent Vercel from retrying
      return res.status(200).send('Error processing request');
    }
  }
  
  // Handle any other method
  logInfo('Unhandled method', { method: req.method });
  return res.status(200).send('Method accepted');
};
  • Replace <your-org-id> in OPENOBSERVE_URL with your organization ID (e.g., nitya_organization_50763_nJO9encbUWsGmSu).

  • Replace the API_KEY value with your own authorization token from step 1.1 (e.g., bml0eWFAb3Blbm9ic2VydmUuYWk6ODAyZ1ozdW80TjVTOTE3czZNZWQ=).

  • Replace the VERCEL_VERIFY_HEADER value with the token Vercel provides during Log Drain setup in section 1.3 (e.g., fd45a95690b57b4b161f0b4413ce086ac5255783).

  • If you enable signature verification (section 1.3.B), you'll also need the secret Vercel provides after creating the Log Drain (e.g., hRcNbZt4MxtDphbDejihnYr8X).
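Rather than committing credentials to your repository, you can read them from environment variables set in Vercel's dashboard. A sketch of that approach (the variable names here are my own, not something Vercel or OpenObserve defines):

```javascript
// Read configuration from environment variables instead of hardcoding.
// Set these under "Settings" > "Environment Variables" in Vercel.
const OPENOBSERVE_URL = process.env.OPENOBSERVE_URL || '';
const API_KEY = process.env.OPENOBSERVE_API_KEY || '';
const VERCEL_VERIFY_HEADER = process.env.VERCEL_VERIFY_HEADER || '';

// Fail loudly at startup if the forwarding target is not configured.
if (!OPENOBSERVE_URL || !API_KEY) {
  console.warn('Missing OpenObserve configuration; logs will not be forwarded.');
}
```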

B. Deploy to Vercel

Add the file, commit, and push:

git add api/logs.js
git commit -m "Add log drain endpoint for OpenObserve"
git push

Once the project is deployed on Vercel, note your logs endpoint, which will look like: https://<your-app>.vercel.app/api/logs (e.g., https://blog-starter-kit-silk-nine.vercel.app/api/logs).

1.3 Configure Log Drain

Now, you are ready to use that endpoint (i.e., https://<your-app>.vercel.app/api/logs) to configure a log drain to direct logs for your Vercel application to OpenObserve:

A. Configure the Log Drain in Vercel

  • Log into vercel.com, and go to “Settings” > “Log Drains.”
  • Click “Add Log Drain” > “Custom Log Drain.”
  • Fill out the form:
    1. Sources: Check “Function” and “Edge” for runtime logs; add “Build” for deployment logs if desired.
    2. Delivery Format: Select “JSON”—Vercel’s default, and OpenObserve handles it natively.
    3. Name: vercel_openobserve_logs—keeps it organized.
    4. Environments: Choose “Production” to target your live app.
    5. Sampling Rate: Set to 100% to capture all logs (adjust later for high-traffic apps).
    6. Endpoint URL: Enter your log endpoint URL from 1.2 above (e.g., https://blog-starter-kit-silk-nine.vercel.app/api/logs).
  • Skip headers—the function manages authentication and verification.
  • Click “Save” to initiate verification.

Note that configuring a Log Drain on Vercel involves the following requirement, which we have already accounted for in 1.2 above:

Verify URL ownership by responding with status code `200` and the following header: `x-vercel-verify: fd45a95690b57b4b161f0b4413ce086ac5255783`

4-setup-log-drain.gif

B. Secure the Log Drain with Vercel’s Secret (optional)

After saving, Vercel confirms the Log Drain creation and provides a secret for securing requests, along with a code snippet for verifying the x-vercel-signature header:

Secret: hRcNbZt4MxtDphbDejihnYr8X

Note: This secret will not be shown again, so copy it immediately.

Vercel also provides a verification snippet:

const crypto = require('crypto');

async function verifySignature(req) {
  const signature = crypto
    .createHmac('sha1', process.env.LOG_DRAIN_SECRET)
    .update(JSON.stringify(req.body))
    .digest('hex');
  return signature === req.headers['x-vercel-signature'];
}

a. Update api/logs.js with this secret:

  • i. Add a LOG_DRAIN_SECRET constant (e.g., hRcNbZt4MxtDphbDejihnYr8X) and call the verification snippet from your POST handler.
  • ii. Alternatively, set it as an environment variable in Vercel (LOG_DRAIN_SECRET) and reference it via process.env.LOG_DRAIN_SECRET, as the snippet above does.

b. Redeploy:

git commit -m "Add Log Drain secret for verification" && git push

This ensures all Log Drain requests are verified, securing your endpoint in production.

C. Test Log Drain

Once you have set up your Log Drain, you can test it to ensure it’s working as expected:

5-test-log-drain.gif

If the test succeeds, you should see a “Test log drain sent successfully” message. If it fails, see the troubleshooting section below.

1.4 View Logs in OpenObserve

Now, navigate to “Logs” in OpenObserve to view the Vercel logs. Select the vercel_logs stream. Then, you should see logs like:

{
  "_timestamp": 1741811750773397,
  "0_branch": "main",
  "0_deploymentid": "dpl_4moaAZzTC7YWvjPVYfvFTFN5iQA1",
  "0_environment": "production",
  "0_executionregion": "iad1",
  "0_host": "blog-starter-kit-silk-nine.vercel.app",
  "0_id": "85082895730174181174596118100000",
  "0_level": "info",
  "0_message": "INFO: Request received {\"method\":\"POST\",\"url\":\"/api/logs\",...}",
  "0_path": "/api/logs",
  "0_projectid": "prj_2J00dsibvUf7qtUv0fVnrxnnvvNb",
  "0_projectname": "blog-starter-kit",
  "0_proxy_clientip": "100.27.231.66",
  "0_proxy_host": "blog-starter-kit-silk-nine.vercel.app",
  "0_proxy_lambdaregion": "iad1",
  "0_proxy_method": "POST",
  "0_proxy_path": "/api/logs",
  "0_proxy_pathtype": "streaming_func",
  "0_proxy_region": "iad1",
  "0_proxy_scheme": "https",
  "0_proxy_timestamp": 1741811745805,
  "0_proxy_vercelcache": "MISS",
  "0_requestid": "mxdfx-1741811745805-8605a0d9c614",
  "0_source": "lambda",
  "0_timestamp": 1741811745961,
  "0_type": "stdout"
}

These logs include metadata (e.g., proxy_clientip, requestid) and your app’s outputs—ideal for debugging.
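Incidentally, the 0_ prefix on these fields is an artifact of how the function forwards the payload: Vercel delivers logs as a JSON array, and spreading an array into an object turns each entry into a numeric key, which OpenObserve then flattens into underscore-separated field names. A quick illustration of the spread behavior:

```javascript
// Spreading an array into an object turns each entry into a numeric key,
// which is why forwarded fields arrive as 0_branch, 0_message, and so on.
const payload = [{ branch: 'main', level: 'info' }];
const forwarded = { ...payload, _meta: { source: 'vercel' } };
console.log(JSON.stringify(forwarded));
// → {"0":{"branch":"main","level":"info"},"_meta":{"source":"vercel"}}
```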

6-view-logs-o2.gif

Now that your logs are correctly streaming into OpenObserve, you can process them using pipelines, visualize them using interactive dashboards, or set up custom alerts to proactively assess and mitigate potential issues with your application.

1.5 Troubleshoot Common Vercel Log Monitoring Issues

Encountering issues? Here’s a detailed guide to diagnose and resolve them, using your log examples:

Verification Fails

  • Symptom: Vercel reports, “URL verification failed. Please make sure to respond with a 200 status code and the provided header.”
  • Diagnosis: Check function logs in Vercel’s “Functions” tab for:
INFO: Handling verification request {}

or

INFO: Handling verification POST request {}

Missing logs? Vercel isn’t reaching your endpoint.

  • Fix: Confirm api/logs.js is deployed at /api/logs and that VERCEL_VERIFY_HEADER matches the token shown in Vercel’s Log Drain form. Update the token if needed, then redeploy:

git commit -m "Fix verification token" && git push

Re-save the Log Drain.

Signature Verification Fails

  • Symptom: Logs show:
INFO: Signature verification failed {}
  • Diagnosis: The x-vercel-signature header doesn’t match the HMAC-SHA1 hash.
  • Fix: Verify LOG_DRAIN_SECRET matches the secret Vercel provided (e.g., hRcNbZt4MxtDphbDejihnYr8X). If using environment variables, ensure it’s set in Vercel’s dashboard under “Settings” > “Environment Variables.” Redeploy:
git commit -m "Fix Log Drain secret" && git push

No Logs in OpenObserve

  • Symptom: vercel_logs stream is empty despite a passing test.
  • Diagnosis: Look for:
INFO: Sending to OpenObserve {"dataSize": 5212}

INFO: OpenObserve response {"status":200,"response":"{\"code\":200,\"status\":[{\"name\":\"vercel_logs\",\"successful\":1,\"failed\":0}]}"}

Missing or showing errors (e.g., status: 400)? Forwarding failed.

  • Fix: Check OPENOBSERVE_URL and API_KEY in api/logs.js. Test:
curl -X POST -H "Authorization: Basic <your-api-key>" -H "Content-Type: application/json" \
-d '[{"test": "hello"}]' https://api.openobserve.ai/api/<your-org-id>/vercel_logs/_json

Redeploy if corrected. Confirm logs in Vercel’s Monitoring tab (e.g., Hello from Vercel at...).

504 Timeout Errors

  • Symptom: Logs show:
"0_level": "error",
"0_message": "START RequestId: d84b9821-0f1e-4256-b36e-c39748fea63e\n[POST] /api/logs status=504\nEND RequestId: d84b9821-0f1e-4256-b36e-c39748fea63e\nREPORT RequestId: d84b9821-0f1e-4256-b36e-c39748fea63e Duration: 15000 ms..."
  • Diagnosis: Vercel Pro functions default to a 15-second timeout.
  • Fix: Optimize sendToOpenObserve—trim payloads (logs show 5212 bytes). You can also raise the per-function limit with maxDuration in vercel.json (higher plans allow longer durations). Test:
curl -X POST https://<your-app>.vercel.app/api/logs -d '[{"test": "small"}]' -H "Content-Type: application/json"
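If trimming payloads isn’t enough, the timeout itself can be raised. On plans that allow it, a maxDuration setting in vercel.json bumps the limit for a specific function. This is a sketch; adjust the path and value to match your project and plan:

```json
{
  "functions": {
    "api/logs.js": {
      "maxDuration": 60
    }
  }
}
```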

Truncated Logs

  • Symptom: 0_message cuts off:
"0_message": "INFO: Received log data {\"preview\":\"[{\\\"id\\\":\\\"55729663910174181174125389600000\\\",\\\"message\\\":\\\"INFO: Request received...\",\"size\":5065}"
  • Diagnosis: Large payloads hit limits.
  • Fix: The function forwards req.body as-is—check dataSize in the function logs. Memory is rarely the bottleneck (these logs show 1769 MB allocated, 78 MB used); reducing log verbosity in your app is usually the most effective fix.

OpenObserve Connection Issues

  • Symptom: Logs show:
INFO: Error sending to OpenObserve {"error":"Request timed out"}
  • Diagnosis: Network or auth issues.
  • Fix: Re-run the curl test from step 1.1. Ensure OPENOBSERVE_URL and API_KEY match your instance. Contact OpenObserve support if the API is down.

Take Control of Your Vercel Logs with OpenObserve

You’ve built a robust Vercel log monitoring system with OpenObserve, using a secure serverless function to streamline the process. Now, you have deep visibility into your app’s runtime—function logs, edge details, and more—all in one place.

Want to learn more or have questions? Join our Slack community or reach out directly. We're here to help make your monitoring journey as smooth as possible!

About the Author

Nitya Timalsina

Nitya is a Developer Advocate at OpenObserve, with a diverse background in software development, technical consulting, and organizational leadership. Nitya is passionate about open-source technology, accessibility, and sustainable innovation.
