How to Secure Vercel Cron Job routes in Next.js 14 (app router)
Learn how to manage cron jobs effectively in Vercel and how to stay on the Hobby plan while executing multiple API requests. Explore cron job duration, error handling, deployments, concurrency control, local execution, and more.

Use Case
Protect your cron job routes to ensure they can't be accessed by anyone with the URL. We need to guarantee that only Vercel's servers can trigger these jobs. To achieve this, we'll use a CRON_SECRET in our environment variables. Vercel will automatically include this secret as a bearer token in the request headers, adding an extra layer of security. This way, you can rest easy knowing your cron jobs are safe from unauthorized access.
TLDR
The image below shows the fastest possible setup for a cron job. In the details that follow, I walk you through setting up a cron job and pulling YouTube statistics.
Basic Limits
There are a few key limits to remember if you are running Next.js on Vercel's Hobby plan: duration, memory, and the number of cron jobs. I call attention to these because setting up your cron job to do all of the work itself will most likely lead to failure. In some minimal cases this might work for you, but ideally you want something like a publish-subscribe pattern.
Max Duration
The default max duration is 10 seconds, while the maximum on the Hobby plan is 60 seconds.
Memory Limits
The limit on Vercel's Hobby plan is 1024 MB / 0.6 vCPU.
Cron Limit
The Hobby plan has a limit of 2 cron jobs, while the Pro plan allows 40. Keep this in mind when architecting your cron solution. At the bottom of this post, I have examples of running YouTube Data API v3 lookups for every video in CodingCat.dev's collection and reporting back that day's views.
Create Secure Endpoint
You can find more details about this in Vercel's Securing cron jobs.
Generate CRON_SECRET
Create a secret key using the command below, or use something like 1Password to generate a key that is at least 16 characters (in this example we use 32). This is just like a password, so make sure not to check it into a git repository.
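One common way to generate a key of this kind is with `openssl` (32 random bytes, printed as 64 hex characters):

```shell
# Generate a 32-byte random value, hex-encoded, to use as CRON_SECRET
openssl rand -hex 32
```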
Copy this key and save it locally in .env.local, assigned to the CRON_SECRET variable as shown below.
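The file is a plain key/value env file; the value below is a placeholder, not a real secret:

```
# .env.local — never commit this file to git
CRON_SECRET=paste-your-generated-secret-here
```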
Route Handler
In the example below, you will create a GET handler that checks for the Bearer authorization header. Create this file at app/api/cron/route.ts.
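A minimal sketch of the handler. App Router route handlers accept a standard Web `Request` and can return a standard `Response`, so no Next.js-specific imports are required here:

```typescript
// app/api/cron/route.ts
// Rejects any request whose Authorization header does not match
// `Bearer ${CRON_SECRET}`. Vercel sends this header automatically
// when it invokes the cron job in production.
export async function GET(request: Request): Promise<Response> {
  const authHeader = request.headers.get("authorization");
  if (authHeader !== `Bearer ${process.env.CRON_SECRET}`) {
    return Response.json({ success: false }, { status: 401 });
  }

  // Your cron work goes here.
  return Response.json({ success: true });
}
```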
Now you can test this function locally to see if you get a success message back by running the below command.
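With `next dev` running (assumed here on the default port 3000), a plain curl with no Authorization header exercises the failure path:

```shell
# No Authorization header — the handler should reject this with 401
curl -i http://localhost:3000/api/cron || true
```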
With this command, you should see that anyone trying to access the endpoint that runs the cron job fails, because they have not sent the correct authorization header the way Vercel does when issuing the job. In the screenshot below, you can see the endpoint returning a 401 Unauthorized.
Now issue the command passing the correct authorization header like below. Pay close attention to include Bearer before your CRON_SECRET.
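The same request with the header added (the secret value is a placeholder for your own CRON_SECRET):

```shell
# The "Bearer " prefix is required — this mirrors what Vercel sends
curl -i http://localhost:3000/api/cron \
  -H "Authorization: Bearer paste-your-generated-secret-here" || true
```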
As you can see in the result below you will now get a HTTP status of 200 OK.
Production
Now that you have tested your cron endpoint locally, it is time to get ready for production. Vercel cron jobs will only run on your production deployment. I wish this weren't true because I like testing off of our dev branch, but I guess you can't win them all 😉.
Vercel Environment Variables
You can copy and paste your environment variable directly into your project settings. You can either leave this checked for all deployments or only check production since these will only run in production. The below screenshot shows the current way of saving this environment variable.
Update Vercel Project File
To set up a cron job you need a vercel.json file at the root of your project that includes the path to call for the cron job and the schedule. In the example below, I have set my cron job to trigger every day at 5 pm (17:00 — note that Vercel cron schedules run in UTC). Try crontab.guru to get the schedule expression exactly how you want it.
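A minimal config with a single cron entry; the path matches the route handler created earlier, and `0 17 * * *` fires once a day at 17:00:

```json
{
  "crons": [
    {
      "path": "/api/cron",
      "schedule": "0 17 * * *"
    }
  ]
}
```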
Now you can push the change to your production branch so that the cron job will start running.
Next Steps
Now that you have a cron job set up, you will typically need to create another dynamic endpoint that lets you call those services in parallel.
Below is the cron endpoint, created just like above, with one addition: it also triggers a new API call to /api/youtube/views, passing the same CRON_SECRET so that route is protected as well.
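A sketch of that hand-off pattern. The base-URL logic is an assumption: `VERCEL_URL` is a system env var Vercel sets on deployments, with a localhost fallback for dev:

```typescript
// app/api/cron/route.ts (sketch)
// Verifies the secret, then delegates the real work to a second
// protected route so this handler finishes well within the
// 10-second default max duration.
export async function GET(request: Request): Promise<Response> {
  const authHeader = request.headers.get("authorization");
  if (authHeader !== `Bearer ${process.env.CRON_SECRET}`) {
    return Response.json({ success: false }, { status: 401 });
  }

  const base = process.env.VERCEL_URL
    ? `https://${process.env.VERCEL_URL}`
    : "http://localhost:3000";

  // Forward the same secret so /api/youtube/views can run the
  // identical Bearer check.
  await fetch(`${base}/api/youtube/views`, {
    headers: { authorization: `Bearer ${process.env.CRON_SECRET}` },
  });

  return Response.json({ success: true });
}
```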
YouTube Statistics using Data API v3
CodingCat.dev has a Sanity.io backend, so some of the updates happening here need a little explanation. If you are not used to Sanity, don't worry about it; it is just like any other database you might be using to store data.
Below is the full code, which functions as a completely self-contained API. This means at any point I can manually trigger the API to update the YouTube statistics for any associated post. So if my next podcast episode starts blowing up, I can immediately get it on our front page as a top or trending podcast to watch. Then, based on the cron schedule above, it runs every day at 5 pm so we never miss adding stats.
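The original embedded code is not reproduced here, so the following is only a rough sketch of the pattern, not the author's implementation. It assumes a `YOUTUBE_API_KEY` env var, uses a placeholder video ID, and leaves the Sanity write as a comment:

```typescript
// app/api/youtube/views/route.ts (illustrative sketch)

// Pure helper: map a YouTube Data API v3 `videos.list` response
// (part=statistics) to { videoId: viewCount }.
export function extractViewCounts(apiResponse: {
  items?: { id: string; statistics?: { viewCount?: string } }[];
}): Record<string, number> {
  const counts: Record<string, number> = {};
  for (const item of apiResponse.items ?? []) {
    counts[item.id] = Number(item.statistics?.viewCount ?? 0);
  }
  return counts;
}

export async function GET(request: Request): Promise<Response> {
  const authHeader = request.headers.get("authorization");
  if (authHeader !== `Bearer ${process.env.CRON_SECRET}`) {
    return Response.json({ success: false }, { status: 401 });
  }

  // Placeholder IDs — in practice, load these from your backend.
  // The videos.list endpoint accepts up to 50 IDs per call.
  const ids = ["VIDEO_ID_1"];
  const url =
    "https://www.googleapis.com/youtube/v3/videos" +
    `?part=statistics&id=${ids.join(",")}&key=${process.env.YOUTUBE_API_KEY}`;
  const data = await (await fetch(url)).json();
  const counts = extractViewCounts(data);

  // Persist each count back to your database here
  // (for CodingCat.dev, a Sanity patch per matching post).
  return Response.json({ success: true, counts });
}
```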
I would highly recommend writing all of your cron jobs like this, so that they initiate another process and avoid the limitations described above. Each invocation of the YouTube Data API and update to our Sanity backend takes about 1 second.