A web application that analyses real-time data, such as news articles and tweets about a particular stock, alongside the stock's price history to produce a consolidated prediction of the next day's closing price.
I. Input Manager
The system's Input management module is responsible for providing correct and compatible input feature sets to machine learning models in order to generate predictions.
It consists of two sub-modules:
- Data Filter Unit
- Data Processing Unit
The data filter is responsible for filtering out unnecessary attributes from the input dataset and narrowing down the input dataset to the required attribute sets for further preprocessing activities.
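As a minimal sketch of the filter unit (the attribute names here are assumptions, not the application's actual schema), filtering amounts to dropping every attribute the models do not need:

```python
# Hypothetical sketch of the Data Filter Unit: keep only the
# attributes required downstream (the column names are assumptions).
REQUIRED_ATTRIBUTES = {"Date", "Close"}

def filter_attributes(records):
    """Drop every attribute not listed in REQUIRED_ATTRIBUTES."""
    return [
        {key: value for key, value in row.items() if key in REQUIRED_ATTRIBUTES}
        for row in records
    ]

raw = [
    {"Date": "2023-01-02", "Open": 130.3, "High": 131.0, "Low": 124.2, "Close": 125.1},
    {"Date": "2023-01-03", "Open": 126.9, "High": 128.7, "Low": 125.1, "Close": 126.4},
]
print(filter_attributes(raw))
```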
The data processing unit is responsible for various preprocessing activities such as data cleaning, normalization, data reshaping suitable for models, processing text data to sentiment values for sentiment-based models, and other necessary data processing.
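One representative preprocessing step can be sketched as follows; this min-max normalization into the [0, 1] range is a common scaling applied before feeding prices to an LSTM, and is an illustration rather than the unit's exact pipeline:

```python
# Hypothetical sketch of one Data Processing Unit step:
# min-max normalization of closing prices into the [0, 1] range.
def min_max_normalize(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

closes = [120.0, 125.0, 130.0]
print(min_max_normalize(closes))  # → [0.0, 0.5, 1.0]
```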
II. Historical Data Model
The historical data model uses LSTM networks and takes a single-feature set, "Closing price", of shape (1, 60, 1): an array of closing-price values for the past 60 days. The feature set is fed to the model, which generates the prediction for the next day.
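The (1, 60, 1) shape above corresponds to (batch of 1 sample, 60 time steps, 1 feature). A small sketch of building that input tensor (the price values are stand-ins):

```python
import numpy as np

# Hypothetical sketch: reshape the last 60 closing prices into the
# (1, 60, 1) tensor the historical-data LSTM expects:
# 1 sample, 60 time steps, 1 feature per step.
closes = np.arange(60, dtype=float)   # stand-in for 60 days of closing prices
features = closes.reshape(1, 60, 1)
print(features.shape)  # → (1, 60, 1)
```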
III. Sentiment Based Model
The sentiment-based model also employs LSTM networks and takes two features, "Closing price" and "Sentiment", of shape (1, 3, 2): an array of closing-price and sentiment values from the previous three days. The sentiment values are generated by running the VADER sentiment analyzer over tweets and news feeds from the previous three days. The model is fed the feature set and forecasts the next day's closing price.
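The (1, 3, 2) shape corresponds to (1 sample, 3 time steps, 2 features per step). A small sketch of assembling that tensor; the price and sentiment values are made-up stand-ins, with sentiments mimicking VADER compound scores in [-1, 1]:

```python
import numpy as np

# Hypothetical sketch: stack 3 days of closing prices with their
# daily sentiment scores into the (1, 3, 2) tensor the
# sentiment-based LSTM expects.
closes = [125.1, 126.4, 124.8]     # assumed closing prices, last 3 days
sentiments = [0.42, -0.10, 0.25]   # assumed daily compound sentiment scores

# stack → (3, 2), then add a leading batch axis → (1, 3, 2)
features = np.stack([closes, sentiments], axis=-1)[np.newaxis, ...]
print(features.shape)  # → (1, 3, 2)
```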
IV. Output
The output comprises the predictions of both models. Predictions from the historical-data model are statistical forecasts based on previous closing prices, whereas predictions from the sentiment-based model capture drastic changes that public sentiment may trigger.
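The overview mentions a consolidated prediction; one way to picture that consolidation is a weighted blend of the two model outputs. The weighting scheme below is purely an assumption for illustration, not the application's actual rule:

```python
# Hypothetical consolidation of the two model outputs into one
# prediction; the 50/50 weighting is an assumption.
def consolidate(historical_pred, sentiment_pred, sentiment_weight=0.5):
    """Blend the historical and sentiment-based predictions."""
    return (1 - sentiment_weight) * historical_pred + sentiment_weight * sentiment_pred

print(consolidate(128.0, 132.0))  # → 130.0
```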
- Recommended: create a virtual environment with Python 3.8.10 using the command:
  virtualenv -p python3.8 <my_env_name>
- Activate the virtualenv:
  source <my_env_name>/bin/activate
- Use pip/pip3 and requirements.txt to install the required packages:
  pip install -r requirements.txt
- Manually install any other missing packages using the pip command:
  pip install <package_name>
- Edit app.py and update the News API credentials (api-key, generated at https://newsapi.org/)
- Run the Streamlit web server using the command:
  streamlit run app.py