Open-Source Job Application Search Engine
- VectorDB & Embedding Models: Upstash Vector
- Scheduling & Serverless Function Orchestration: Upstash QStash
- App logic: Next.js
- Deployment: Vercel
- File Storage: uploadthing
- LLM: OpenAI
- UI Components: shadcn
Step 1: Clone the repository
- Clone the repository:
```
git clone https://github.com/upstash/purple-squirrel.git
cd purple-squirrel
```
- Create a `.env` file in the root directory, and copy the contents of `.env.local.example` into it.
- Fill the environment variables as described in the next steps.
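If you are using a POSIX shell, creating the file is a one-liner from the project root:

```
cp .env.local.example .env
```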
Step 2: Connect your mailbox
Note: This tutorial is based on Gmail, but you can set up an IMAP connection with any other provider. We recommend creating a separate address such as ps@company.com and forwarding job application emails there. You can also create a folder such as JOBS and configure the application to read from that folder in the setup step.
- Complete the steps described in this tutorial to generate an App Password for your account.
- Fill the following environment variables in your `.env` file:
  - `IMAP_USERNAME`: Your mail address
  - `IMAP_PASSWORD`: The App Password you generated
  - `IMAP_HOST`: `imap.gmail.com`
  - `IMAP_PORT`: `993`
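For a Gmail setup, the IMAP section of your `.env` file ends up looking something like this (the username and password below are placeholders):

```
# IMAP connection used to fetch incoming application mails (placeholder values)
IMAP_USERNAME=ps@company.com
IMAP_PASSWORD=your-app-password
IMAP_HOST=imap.gmail.com
IMAP_PORT=993
```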
Step 3: Set up Upstash Vector
- Open an Upstash account.
- Switch to the Vector tab in the Upstash Console.
- Click Create Index.
- Think of a name and select a region close to your users, then set the Embedding Model, Dimensions, and Metric (the app relies on Upstash's built-in embedding models, so select one of the built-in models when creating the index).
- Click Next -> Click Create.
- Fill the following environment variables in your `.env` file, which can be found and copied on your index page:
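The index page shows these credentials as a ready-to-copy `.env` snippet. Assuming the project follows Upstash's standard variable names for the Vector REST credentials, the entries look like this (placeholder values):

```
# Upstash Vector REST credentials (assumed standard names; copy the real values from your index page)
UPSTASH_VECTOR_REST_URL=https://your-index-region-vector.upstash.io
UPSTASH_VECTOR_REST_TOKEN=your-vector-rest-token
```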
Step 4: Set up Upstash QStash
- Switch to the QStash tab in the Upstash Console.
- Fill the following environment variables in your `.env` file, which can be found and copied on your QStash page:
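The QStash page shows these values in a copyable block as well. Assuming the project uses QStash's standard variable names, the entries look like this (placeholder values):

```
# Upstash QStash credentials (assumed standard names; copy the real values from the QStash page)
QSTASH_URL=https://qstash.upstash.io
QSTASH_TOKEN=your-qstash-token
QSTASH_CURRENT_SIGNING_KEY=your-current-signing-key
QSTASH_NEXT_SIGNING_KEY=your-next-signing-key
```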
The QStash free plan has a limit of 500 messages per day, which limits your mail pipeline to approximately 200 applicants per day. We recommend upgrading to the pay-as-you-go plan. See QStash Pricing for more information.
Step 5: Set up uploadthing
- Sign in to uploadthing.
- Click Create a new app.
- Think of a name and select an app default region close to your users.
- Fill the following environment variables in your `.env` file, which can be found and copied in the API Keys tab:
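The API Keys tab also offers a copyable `.env` snippet. The exact variable names depend on the uploadthing SDK version the project uses, but they are typically along these lines (placeholder values):

```
# uploadthing credentials (variable names are an assumption; use the snippet uploadthing shows you)
UPLOADTHING_SECRET=sk_live_your-secret
UPLOADTHING_APP_ID=your-app-id
```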
Step 6: Set OpenAI API key
- Go to OpenAI Platform -> API keys and log in to your account.
- Click Create new secret key.
- Enter a name and click Create secret key.
- Don't forget to copy and save your key. Fill the following environment variable in your `.env` file:
  - `OPENAI_API_KEY`
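In the `.env` file this is a single line (the value below is a placeholder):

```
# OpenAI API key used for the LLM features (placeholder value)
OPENAI_API_KEY=sk-your-secret-key
```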
Step 7: Set up Basic Auth
- Decide if you want Basic Auth in your application, and set the following environment variable in your `.env` file: `BASIC_AUTH_ENABLED=true` or `BASIC_AUTH_ENABLED=false`.
- If you want Basic Auth, fill the following environment variables in your `.env` file:
  - `BASIC_AUTH_USERNAME`
  - `BASIC_AUTH_PASSWORD`
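With Basic Auth enabled, the relevant lines of your `.env` file look like this (pick your own credentials; the ones below are placeholders):

```
# Basic Auth for the application (placeholder credentials)
BASIC_AUTH_ENABLED=true
BASIC_AUTH_USERNAME=admin
BASIC_AUTH_PASSWORD=choose-a-strong-password
```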
Step 8: Deploy & Setup
- Deploy your application to Vercel with the following command:
```
vercel
```
- Go to your project at Vercel Dashboard for the next steps.
- Find the Production Domain of your application in the Project tab.
- Go to Settings -> Environment Variables, and copy and paste your `.env` file.
- Fill the following environment variable with the Production Domain (not the Deployment URL) of your application:
  - `NEXT_PUBLIC_URL`: Production Domain of your application (e.g. https://your-app.vercel.app)
- Go to the Deployments tab and redeploy your application.
- Visit `https://your-app.vercel.app/setup` to set up your application.
→ Your application is ready to use!
Local Development

A local tunnel is required in local development since QStash requires a publicly available API to send messages to. This tutorial is based on localtunnel.me, but you can use any service of your choice.
Step 1: Create a local tunnel
```
npx localtunnel --port 3000
```
Step 2: Fill environment variables
Copy the output URL and fill the following environment variable in `.env.local`:

```
LOCAL_TUNNEL_URL=<YOUR_URL>
```

Fill the rest of the environment variables in `.env.local` as described in the Deploy your own section.
Step 3: Install dependencies & run the project
```
npm install
npm run dev
```
→ Your application is ready to use!
In local development, the mail pipeline is triggered only once instead of being scheduled, since the local server is not expected to be always available.
We welcome contributions to improve this project. Please feel free to submit issues or pull requests.