
network error #516

Open
Reginald-L opened this issue Apr 29, 2024 · 4 comments

@Reginald-L

Hi, thanks very much for your work. When I ran the codebase following the Development rules, I got a network error. May I get your help?

  1. npm install --- under web_app
  2. add a .env.local file --- VITE_BACKEND=http://127.0.0.1:9999
  3. npm run dev ---- http://localhost:5173/
  4. error message in the browser console (screenshot "error1" attached); the full command sequence is sketched below
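
For reference, the full sequence of commands (assuming web_app sits in the repository root) was roughly:

    cd web_app                                                 # frontend lives here
    npm install                                                # install dependencies
    echo "VITE_BACKEND=http://127.0.0.1:9999" > .env.local     # point the UI at the backend
    npm run dev                                                # Vite serves http://localhost:5173/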

Sanster commented Apr 29, 2024

Front-end development requires starting the corresponding backend service.

pip install -r requirements.txt
python3 main.py --model lama --port 9999
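
A quick way to confirm the backend is actually listening before opening the frontend is to hit the port directly, for example:

    # Any HTTP response here means the server is up;
    # "connection refused" means the Python process never started.
    curl -I http://127.0.0.1:9999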


Reginald-L commented Apr 29, 2024

Front-end development requires starting the corresponding backend service.

pip install -r requirements.txt
python3 main.py --model lama --port 9999

Thanks very much for your response. I tried these instructions, but the problem still remains. It seems the backend program is not running.
I tried running 'python3 main.py --model lama --port 9999' both before and after npm run dev.
(screenshot of the terminal attached)

What is more interesting: when I run 'pip install iopaint' and then 'iopaint start --model lama --port 9999', the network error at localhost:5173 disappears.


Sanster commented Apr 29, 2024

Executing python3 main.py --model lama --port 9999 and executing iopaint start --model lama --port 9999 should be the same. It looks like your command line didn't print any information when executing python3 main.py, I'm not sure what the issue is.

If you do not need to modify the backend code, executing iopaint start --model lama --port 9999 indeed works as well.
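
(For context on why the two commands should behave identically: a minimal sketch, assuming main.py is just a thin wrapper that dispatches to the same CLI entry point the installed iopaint command uses; the function name entry_point is an assumption, not a quote of the actual file.)

    # main.py (sketch): assumed thin wrapper around the iopaint CLI,
    # so running it is equivalent to the installed `iopaint` command.
    from iopaint import entry_point  # entry point name is assumed

    if __name__ == "__main__":
        entry_point()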

@Reginald-L

Executing python3 main.py --model lama --port 9999 and executing iopaint start --model lama --port 9999 should be the same. It looks like your command line didn't print any information when executing python3 main.py, I'm not sure what the issue is.

If you do not need to modify the backend code, executing iopaint start --model lama --port 9999 indeed works as well.

Cool, thanks very much for your help and for the great work.
