- NE
Hi, my colleague has set up a fresh ZenML server. He tried to execute the quickstart example. Everything went well and the pipeline is successful, but the MLflow UI is not accessible; we get "URL not found". Any idea? netstat output: netstat -an | grep 8000
tcp4 0 0 127.0.0.1.8000 127.0.0.1.51359 ESTABLISHED
tcp4 0 0 127.0.0.1.51359 127.0.0.1.8000 ESTABLISHED
tcp4 0 0 127.0.0.1.8000 *.* LISTEN
tcp4 0 0 127.0.0.1.51355 127.0.0.1.8000 TIME_WAIT
tcp4 0 0 127.0.0.1.51356 127.0.0.1.8000 TIME_WAIT
- AL
how about ?
- NE
Method Not Allowed
- AL
Alright, so it does seem like the service for the MLflow model deployer did start up. To the best of my knowledge, this specific endpoint is meant for the MLflow deployer to serve its models from, so it is not an endpoint you can access via a GET request (which is what your browser is doing here).
Are you looking for the MLflow tracking dashboard?
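As a side note, a deployed MLflow model server is typically queried with a POST to its /invocations route rather than a GET. A minimal sketch, assuming the service on port 8000 above is a standard MLflow scoring server; the payload shape and feature values are placeholders and depend on your MLflow version and model signature:
```
# A minimal sketch: querying an MLflow model server with a POST request.
# Assumes a standard MLflow scoring server on 127.0.0.1:8000; the payload
# shape is a placeholder (older MLflow versions expect pandas-split JSON,
# newer ones accept an "inputs" key).
import requests

response = requests.post(
    "http://127.0.0.1:8000/invocations",
    json={"inputs": [[0.1, 0.2, 0.3]]},  # hypothetical feature values
)
print(response.status_code, response.text)
```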
- NE
To view the models and versions. It's the registry.
- NE
how do we validate that the model is deployed in MLflow?
- SA
Hey @nethrasms, are you trying to access the MLflow UI, or just to verify that the model has been deployed?
- NE
As per the pipeline, it shows the model is deployed. So I wanted to verify by seeing the MLflow UI.
- NE
In the previous thread, they were able to see the MLflow UI, right?
- ST
@nethrasms yes, we somehow forgot to add this to the example, but you can also do it. Just add the following lines to your run.py:
```
from zenml.integrations.mlflow.mlflow_utils import get_tracking_uri
from rich import print

print(
    "You can run:\n "
    f"[italic green] mlflow ui --backend-store-uri '{get_tracking_uri()}'"
    "[/italic green]\n ...to inspect your experiment runs and models "
    "within the MLflow UI.\nYou can find your runs tracked within the "
    "`training_pipeline` and `continuous_deployment_pipeline` experiments. "
    "There you'll also be able to compare two or more runs and view the "
    "registered models.\n\n"
)
```
- SA
Ah, so MLflow actually won't show you the deployed model; it will only show you the experiments and registered models. However, we have commands in our CLI that you can use to see if the model is still running in the background and can be used to run predictions on.
- To access the MLflow UI, you can run the following:
```
# Get the MLflow tracking URI
from zenml.integrations.mlflow.mlflow_utils import get_tracking_uri
print(get_tracking_uri())

# Run in your terminal:
mlflow ui --backend-store-uri="<TRACKING_URI_PATH>"
```
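With the default local stack, get_tracking_uri() typically returns a local file: URI, and mlflow ui will then serve the dashboard from that store, by default at http://127.0.0.1:5000.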
- You can use the ZenML CLI to see deployed models by running the following:
```
# To get a list of deployed models
zenml model-deployer models list

# Get detailed information about a deployed model by running:
zenml model-deployer models describe <ID_OF_THE_MODEL>
```
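- If you prefer to check programmatically, here is a minimal sketch using ZenML's MLflow model deployer API; the pipeline and step names below are placeholder guesses from the quickstart, and the exact keyword arguments may differ between ZenML versions:
```
# A minimal sketch: checking a deployed MLflow model server from Python.
# Pipeline/step names are placeholders; keyword arguments may vary across
# ZenML versions.
from zenml.integrations.mlflow.model_deployers.mlflow_model_deployer import (
    MLFlowModelDeployer,
)

model_deployer = MLFlowModelDeployer.get_active_model_deployer()
services = model_deployer.find_model_server(
    pipeline_name="continuous_deployment_pipeline",  # hypothetical name
    pipeline_step_name="model_deployer",             # hypothetical name
)
if services:
    service = services[0]
    print(f"Running: {service.is_running}")
    print(f"Prediction URL: {service.prediction_url}")
```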
- ST
heh :slightly_smiling_face: even better
- NE
Thanks. It shows up :slightly_smiling_face:
- ST
sweet! :slightly_smiling_face:
- NE
Reminder to update the code in the example, please :slightly_smiling_face:
- ST
will be in the next release, of course :slightly_smiling_face: