- Update config.yaml
- Update secrets.yaml [optional]
- Update params.yaml
- Update the entity
- Update the configuration manager in src/config
- Update the components
- Update the pipeline
- Test run the pipeline stage
- Run tox to test your package
- Update the dvc.yaml
- Run "dvc repro" to run all the stages in the pipeline
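The dvc.yaml mentioned above can be sketched for a single stage as follows (the stage name, script path, and dependency/output names are assumptions about this project; adjust them to the actual layout):

```yaml
stages:
  data_ingestion:
    cmd: python src/deepClassifier/pipeline/stage_01_data_ingestion.py
    deps:
      - src/deepClassifier/pipeline/stage_01_data_ingestion.py
      - configs/config.yaml
    outs:
      - artifacts/data_ingestion
```

With stages declared like this, "dvc repro" reruns only the stages whose dependencies changed.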
Run the unit tests with pytest
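A minimal pytest-style test to illustrate the step above (the file path tests/test_params.py and the required keys are assumptions about this project, not its actual test suite):

```python
# tests/test_params.py (illustrative path)
def test_params_have_required_keys():
    # Stand-in for values loaded from params.yaml.
    params = {"EPOCHS": 1, "BATCH_SIZE": 16, "IMAGE_SIZE": [224, 224, 3]}
    # pytest reports a failure if any required key is missing.
    assert {"EPOCHS", "BATCH_SIZE", "IMAGE_SIZE"} <= set(params)
```

Run it with `pytest tests/`, or via `tox`, which invokes pytest inside an isolated test environment.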
Testing MLflow: run research/mlexample.py, then run "mlflow ui"; the server reports: INFO:waitress:Serving on http://127.0.0.1:5000
MLflow server:
mlflow server \
    --backend-store-uri sqlite:///mlflow.db \
    --default-artifact-root ./artifacts \
    --host 0.0.0.0 -p 1234
On a Windows system: open localhost:1234 in the browser.
DVC + Git + GitHub + DagsHub + MLflow
STEP 1: Set the remote URI in the Python <script.py>:
remote_server_uri = "https://dagshub.com/ArunKhare/DEEPCNNClassifier.mlflow"
mlflow.set_tracking_uri(remote_server_uri)
STEP 2: Set the environment variables | Get them from DagsHub -> Remote tab -> MLflow tab
export MLFLOW_TRACKING_URI=https://dagshub.com/ArunKhare/DEEPCNNClassifier.mlflow
export MLFLOW_TRACKING_USERNAME=ArunKhare
export MLFLOW_TRACKING_PASSWORD=<>
python <script.py>
STEP 3: https://dagshub.com/ArunKhare/DEEPCNNClassifier -> Remote -> MLflow -> Go to the MLflow UI. Use MLflow's context manager to start a run, then log metrics, params, and the model.
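The environment-variable setup above can be sketched in Python (the helper `dagshub_mlflow_env` is illustrative, not an MLflow or DagsHub API; the run itself is shown as comments so the sketch stays self-contained without mlflow installed):

```python
import os

def dagshub_mlflow_env(user: str, repo: str, token: str) -> dict:
    """Build the environment variables MLflow reads for a DagsHub remote."""
    return {
        "MLFLOW_TRACKING_URI": f"https://dagshub.com/{user}/{repo}.mlflow",
        "MLFLOW_TRACKING_USERNAME": user,
        "MLFLOW_TRACKING_PASSWORD": token,
    }

# Export the variables for the current process before running the script.
os.environ.update(dagshub_mlflow_env("ArunKhare", "DEEPCNNClassifier", "<token>"))

# With the environment set, a run is started with MLflow's context manager
# (requires `pip install mlflow`):
#
#   import mlflow
#   with mlflow.start_run():
#       mlflow.log_param("learning_rate", 0.01)
#       mlflow.log_metric("accuracy", 0.93)
```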
Prediction Service: start Docker Desktop on the system.
docker build -t prediction_service .
docker run -p 8501:8501 prediction_service   # map the container port to the host (Windows) port
streamlit run app.py
Tag the image:
docker tag pred_service1 arunkhare/pred_service1   # pattern: docker tag firstimage YOUR_DOCKERHUB_NAME/firstimage
Push the image:
docker push arunkhare/pred_service1:tagname
Local run:
cd prediction_service
copy model.h5 to prediction_service
streamlit run app.py
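A minimal Dockerfile sketch for the Streamlit prediction service built above (the base image, file layout, and requirements file are assumptions; the project's actual Dockerfile may differ):

```dockerfile
FROM python:3.8-slim
WORKDIR /app
# Install dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy the app code and the trained model (model.h5).
COPY . .
EXPOSE 8501
CMD ["streamlit", "run", "app.py", "--server.port=8501", "--server.address=0.0.0.0"]
```

The EXPOSE 8501 line matches the `docker run -p 8501:8501` port mapping used above.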
FastAPI (API + UI)
DEPLOYMENT:
- Login to AWS console.
- Create IAM user for deployment with specific access:
  - EC2 access: it is a virtual machine
  - ECR (Elastic Container Registry): to save your Docker image in AWS
Description: About the deployment
- Build the Docker image of the source code
- Push your Docker image to ECR
- Launch your EC2
- Pull your image from ECR in EC2
- Launch your Docker image in EC2
Policy:
- AmazonEC2ContainerRegistryFullAccess
- AmazonEC2FullAccess
- Create ECR repo to store/save the Docker image
  - Save the URI: 315865595366.dkr.ecr.us-east-1.amazonaws.com/simple-app
- Create EC2 machine (Ubuntu)
- Open EC2 and install Docker in the EC2 machine:
  # optional
  sudo apt-get update -y
  sudo apt-get upgrade
  # required
  curl -fsSL https://get.docker.com -o get-docker.sh
  sudo sh get-docker.sh
  sudo usermod -aG docker ubuntu
  newgrp docker
- Configure EC2 as a self-hosted runner: Settings > Actions > Runners > New self-hosted runner > choose OS > then run the commands one by one
- Setup GitHub secrets:
  AWS_ACCESS_KEY_ID=
  AWS_SECRET_ACCESS_KEY=
  AWS_REGION = us-east-1
  AWS_ECR_LOGIN_URI = demo>> 566373416292.dkr.ecr.ap-south-1.amazonaws.com
  ECR_REPOSITORY_NAME = simple-app
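A minimal GitHub Actions sketch that consumes the secrets above to build and push the image to ECR (the workflow name, trigger branch, and step details are assumptions; the project's actual workflow may differ):

```yaml
name: Build and push to ECR
on:
  push:
    branches: [main]
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Login to ECR, build, tag, and push
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          AWS_REGION: ${{ secrets.AWS_REGION }}
        run: |
          aws ecr get-login-password --region "$AWS_REGION" | \
            docker login --username AWS --password-stdin ${{ secrets.AWS_ECR_LOGIN_URI }}
          docker build -t ${{ secrets.ECR_REPOSITORY_NAME }} .
          docker tag ${{ secrets.ECR_REPOSITORY_NAME }}:latest \
            ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
          docker push ${{ secrets.AWS_ECR_LOGIN_URI }}/${{ secrets.ECR_REPOSITORY_NAME }}:latest
```

A second job running on the self-hosted EC2 runner would then pull and run this image.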
- Configure security: add an inbound rule for port 8080