- About The Project
- Overview
- Azure Services Used
- Project Structure
- Features
- Installation and Usage
- Credits
Additionally, the project displays the 10 most relevant climate change news articles per country on a monthly basis, classified into good or bad news via a custom-trained text-classification model.
The collected data is visualized on multiple 3D globes rendered on the website.
The color of each component (red, yellow, or green) reflects the general condition of the corresponding parameter.
As the project draws on monthly news headlines and climate statistics for various countries, we chose Azure CosmosDB for MongoDB to store this data in a NoSQL format in the cloud.
This data is queried by month every time a user requests it through the front end.
A screenshot of the CosmosDB data explorer, showing a sample document
We chose CosmosDB for MongoDB because we wanted real-time fetching of data in a specific format (JSON) only.
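For illustration only (the actual schema is not shown in this README), a monthly document might look roughly like the sketch below, and the backend can look it up with a simple filtered query via the MongoDB Node.js driver; the database, collection, and field names are placeholders:

```js
const { MongoClient } = require("mongodb");

// Hypothetical shape of one monthly document (field names are illustrative,
// not the project's actual schema):
// { month: 1, year: 2024, countries: { IN: {
//     temperatureAnomaly: 1.2,
//     pollution: { co: 300.4, no2: 12.1, so2: 4.3, o3: 60.2 },
//     news: [{ headline: "...", label: "Good News" }] } } }

// Connection string for Azure CosmosDB for MongoDB, e.g. from app settings.
const client = new MongoClient(process.env.COSMOS_CONNECTION_STRING);

// Fetch a single month's document by month and year.
async function getMonthData(month, year) {
  await client.connect();
  return client
    .db("stateoftheworld")   // database name is a placeholder
    .collection("months")    // collection name is a placeholder
    .findOne({ month, year });
}
```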
To update the database with the complete statistics and headlines of the prior month, we leveraged the event-driven nature of Azure Functions, particularly the timer trigger.
We configured the data fetch functions to trigger on the 1st of every new month.
The main file of the Azure Functions project, with the CRON timer trigger.
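A minimal sketch of such a timer trigger, assuming the Azure Functions v4 JavaScript programming model; the function name and the fetch-and-store helper are placeholders, and `0 0 0 1 * *` is the NCRONTAB expression for midnight UTC on the 1st of every month:

```js
const { app } = require("@azure/functions");

// Runs at 00:00 UTC on the 1st of every month
// (NCRONTAB fields: second minute hour day month day-of-week).
app.timer("monthlyDataFetch", {
  schedule: "0 0 0 1 * *",
  handler: async (timer, context) => {
    context.log("Fetching the previous month's statistics and headlines...");
    // Placeholder for the project's actual data-fetching logic:
    // await fetchAndStorePreviousMonth();
  },
});
```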
To estimate the overall sentiment of the monthly news headlines mined for each country, we used the Custom Text Classification feature of the Azure AI Language service. This involved manually labelling 930+ headlines as "Good News" or "Bad News", and training a custom multilingual text-classification model on this dataset through the Language Studio UI.
The model achieved an F1 score of 84.95%.
Performance of the model upon testing
We used the Azure Language REST API to programmatically classify headlines mined on the fly before storing them in the database.
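A rough sketch of how such a request can be submitted to the asynchronous analyze-text jobs endpoint of the Language REST API; the endpoint, key, project name, and deployment name are placeholders for the values of the actual Language resource:

```js
// Submit a batch of headlines to the custom classification model.
// Requires Node 18+ for the global fetch API.
async function classifyHeadlines(headlines) {
  const response = await fetch(
    `${process.env.LANGUAGE_ENDPOINT}/language/analyze-text/jobs?api-version=2022-05-01`,
    {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "Ocp-Apim-Subscription-Key": process.env.LANGUAGE_KEY,
      },
      body: JSON.stringify({
        analysisInput: {
          // Language is per document; "en" is used here for illustration.
          documents: headlines.map((text, i) => ({ id: String(i), language: "en", text })),
        },
        tasks: [
          {
            kind: "CustomSingleLabelClassification",
            parameters: {
              projectName: "<project-name>",       // placeholder
              deploymentName: "<deployment-name>", // placeholder
            },
          },
        ],
      }),
    }
  );
  // The job is asynchronous: poll the URL returned in the
  // "operation-location" header until the Good/Bad News labels are ready.
  return response.headers.get("operation-location");
}
```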
We deployed the Node.js backend server on Azure Web Apps. The server exposes two custom API endpoints: one for querying a month's data from the database, and one for fetching countries' annual CO2 emissions.
These API endpoints are live at https://sotwserver.azurewebsites.net/months/{month}/{year} and https://sotwserver.azurewebsites.net/co2, respectively.
Note: The month data was only mined for the year 2024.
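A rough sketch of what these two Express routes could look like; the route handlers and data-access helpers below are illustrative, not the project's actual code:

```js
const express = require("express");
const app = express();

// GET /months/:month/:year - statistics and headlines for the requested month
// (data was only mined for 2024).
app.get("/months/:month/:year", async (req, res) => {
  const month = Number(req.params.month);
  const year = Number(req.params.year);
  const doc = await getMonthData(month, year); // hypothetical DB helper
  if (!doc) return res.status(404).json({ error: "No data for that month" });
  res.json(doc);
});

// GET /co2 - annual CO2 emissions per country.
app.get("/co2", async (_req, res) => {
  res.json(await getAnnualCo2()); // hypothetical DB/JSON helper
});

app.listen(process.env.PORT || 3000);
```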
The project is divided into three main directories:
- `backend/`: Contains the server-side code, including API routes, as well as the Azure Functions.
- `frontend/`: Contains the client-side code, including React components, assets, and utility functions.
- `shared/`: Contains shared resources, such as JSON data and common utilities shared between the server and the Azure Functions. These are `npm install`'d in their respective directories.
The tech stack for the project includes:
- MongoDB as the database, hosted through Azure CosmosDB.
- Node and Express server backend deployed to Azure Web Apps.
- Serverless Azure Functions that are triggered every month for updating the database.
- React on the front end, using Three.js for 3D rendering and Tailwind for styling, deployed through Vercel.
- Monthly temperature anomalies of countries are calculated from data obtained from the OpenWeatherMap historical and accumulated parameters API.
- Monthly averages of CO, NO2, SO2 and O3 emissions of countries are calculated from the OpenWeather Air Pollution API.
- Climate change news for each month is fetched via scraping advanced Google search results.
- Classification of news headlines is done by a custom-trained text-classification machine learning model built in Azure AI Language Studio.
- This data is displayed intuitively on multiple globes we have devised for this project.
- The data is updated automatically every month.
- Additionally, the server-side code caches data received from the database for an hour, to avoid unnecessary database queries.
- On the client side, the browser is set to cache received data for 12 hours (see the caching sketch after this list).
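A minimal sketch of this two-level caching in Express, assuming a simple in-memory cache with a one-hour TTL on the server and a `Cache-Control` header instructing the browser to reuse responses for 12 hours; all names are illustrative:

```js
// Simple in-memory cache keyed by request URL.
const cache = new Map();
const SERVER_TTL_MS = 60 * 60 * 1000; // server keeps responses for 1 hour
const BROWSER_TTL_S = 12 * 60 * 60;   // browser keeps responses for 12 hours

// Wraps a data-producing handler with both caching layers.
function cached(handler) {
  return async (req, res) => {
    // Tell the browser to reuse this response for 12 hours.
    res.set("Cache-Control", `public, max-age=${BROWSER_TTL_S}`);

    const hit = cache.get(req.originalUrl);
    if (hit && hit.expires > Date.now()) {
      return res.json(hit.body); // served without touching the database
    }

    const body = await handler(req);
    cache.set(req.originalUrl, { body, expires: Date.now() + SERVER_TTL_MS });
    res.json(body);
  };
}

// Usage (hypothetical helper):
// app.get("/months/:month/:year", cached(req =>
//   getMonthData(Number(req.params.month), Number(req.params.year))));
```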
The website can be accessed at http://stateoftheworld.vercel.app/
The data can be accessed at https://sotwserver.azurewebsites.net/months/ ... followed by the required month and year
You can also build the website from source locally:
- Clone the repository.
- Install the dependencies in the root directory, frontend, and backend:
npm install
cd frontend && npm install
cd ../backend && npm install
To start the frontend development server:
cd frontend && npm run dev
To start the backend server:
cd backend && node index.js
"State of the World" was ideated, designed and developed by
Dharmisht SVK [dragn0id]
- Tested the project throughout every stage of development
- Prepared the codebase for hosting
- Hosted Node.js server on Azure App Services
- Deployed the web app to Vercel
Dhyaan Kotian [Dhyaan1]
- Developed the website frontend UI and responsive design
- Developed the website functionality to work smoothly with data fetched from Azure CosmosDB
- Trained and deployed the Azure Language AI custom classification model and created the API endpoints for it
Hrishik Sai [fringewidth]
- Developed the main backend functionality, including data fetching through Azure Functions, the custom API endpoints, and the CosmosDB database
- Implemented all globes using Three.js
- Designed the website UI