Unverified Commit ae26e87b authored by dianajlailaty, committed by GitHub

Adding the Audit and Traceability to the Plotly dashboard (#778)



* Adding the Audit and Traceability to the Plotly dashboard

* Update PMLUserGuide.adoc

* Update PMLUserGuide.adoc

* Update PMLUserGuide.adoc

* Update PMLUserGuide.adoc

* Update PMLUserGuide.adoc
Co-authored-by: Mohamed BOUSSAA <mohamed.boussaa@activeeon.com>
parent cbde3d65
@@ -841,45 +841,28 @@ image::MAAS_ML_Delete_Service.PNG[align=center]
There is also one more action that can be executed from the Service Automation Portal:

- *Update_MaaS_ML_Parameters*: This action enables you to update the variable values associated with the MaaS_ML instance according to your new preferences.
==== MaaS_ML Analytics

When the MaaS_ML service instance is running, you can access the MaaS_ML Analytics page by clicking on the instance's endpoint. This page contains four tabs:
** Audit and Traceability
** Dataset Analytics
** Data Drift Analytics
** Predictions Preview
===== Audit and Traceability

When clicking on the endpoint, the user is redirected to a webpage with four tabs, where the Audit and Traceability tab is opened by default. In this tab, the user can check the values chosen for the MaaS_ML instance variables. In addition, the MaaS_ML traceability information and warnings are listed in a table where each row describes an event (initialization, deployment, prediction, etc.) with its corresponding date and time. The figure below shows an overview of the Audit and Traceability tab.
image::traceability1.png[align=center]
===== Dataset Analytics
As MaaS_ML supports versioning, you are able to deploy multiple model versions for the same model type. When deploying different model versions, you have the possibility to associate each version with a subset of the data used to train the model, i.e., the baseline data. The main job of the baseline data is to help in detecting drifts in the future input datasets. Data drift detection is detailed in the <<_data_drift_detection_ddd>> subsection. Using the baseline datasets optionally deployed with the different model versions, you are able to compare the changes occurring from one model version to another, specifically regarding the datasets used to train them.

As shown in the figure below, using the three dropdowns at the top of this tab, you can choose the model name, the feature (or column) name you would like to monitor, and the metric, which is based on one of several statistical functions (Mean, Minimum, Maximum, Variance, Standard Deviation). Once these three values are chosen, the first graph shows the evolution of the values of the chosen feature (according to the chosen statistical function) across the different model versions. You can also monitor multiple features at the same time by choosing multiple feature names in the second dropdown, and add or remove any of the displayed graphical lines using that dropdown. Details about the obtained values are displayed by hovering over the markers on each graphical line.

If you click on one of these markers, a histogram appears in the second graph of this tab. The displayed histogram compares the probability density distributions of the data values of the selected feature among all the deployed model versions. By clicking on the entries in the legend, you can include in or exclude from the comparison any of the model versions.
image::data_analytics_tab.png[align=center]
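For illustration, the sketch below reproduces the kind of computation behind these two graphs outside the dashboard, assuming the baseline datasets are available locally as CSV files. The file names, the version labels, and the selected feature are hypothetical.

[source,python]
----
import pandas as pd
import plotly.express as px

# One baseline dataset per deployed model version (hypothetical file names).
baselines = {
    "v1": pd.read_csv("baseline_v1.csv"),
    "v2": pd.read_csv("baseline_v2.csv"),
}
feature = "age"  # hypothetical feature (column) picked in the dropdown

# First graph: the five statistical metrics, computed per model version.
stats = pd.DataFrame({
    version: {
        "Mean": df[feature].mean(),
        "Minimum": df[feature].min(),
        "Maximum": df[feature].max(),
        "Variance": df[feature].var(),
        "Standard Deviation": df[feature].std(),
    }
    for version, df in baselines.items()
})
print(stats)  # one column per model version, one row per metric

# Second graph: overlayed probability density distributions of the selected
# feature, one histogram per model version.
long_form = pd.concat(
    df[[feature]].assign(version=version) for version, df in baselines.items()
)
fig = px.histogram(long_form, x=feature, color="version",
                   barmode="overlay", histnorm="probability density")
fig.show()
----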
@@ -887,16 +870,13 @@ image::data_analytics_tab.png[align=center]
Coming soon!
===== Predictions Preview
When the user calls a deployed model of a specific version to obtain predictions, they can choose to save the resulting predictions. The saved predictions can be previewed in the Predictions Preview tab. As shown in the figure below, you can choose the model name and the model version using the dropdowns at the top of the page; the corresponding predictions dataframe is then previewed.
image::predictions_tab1.png[align=center]
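As a hedged sketch, the snippet below shows how such a prediction call could look from a Python client. The endpoint URL, the parameter names (`model_name`, `model_version`, `save_predictions`), and the payload layout are assumptions for illustration only; the actual REST API of your MaaS_ML instance is documented in its Swagger UI (see the next section).

[source,python]
----
import pandas as pd
import requests

# Hypothetical input data to score.
df_test = pd.read_csv("test_data.csv")

# Hypothetical endpoint URL, parameter names, and payload layout; check the
# Swagger UI of your MaaS_ML instance for the actual REST API.
response = requests.post(
    "https://try.activeeon.com/maas-ml/api/predict",
    params={
        "model_name": "my_model",
        "model_version": 2,
        "save_predictions": True,  # ask the service to keep the results
    },
    json={"data": df_test.to_dict(orient="records")},
)
response.raise_for_status()

# Preview the returned predictions, as the Predictions Preview tab does.
predictions = pd.DataFrame(response.json())
print(predictions.head())
----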
=== MaaS_ML Via Swagger UI
To access the Swagger UI, click on the "GO TO SWAGGER UI" button at the top of the Audit and Traceability tab in the MaaS_ML Analytics page.
Through this Swagger UI, you are now able to:
......
src/docs/images/traceability1.png (image updated: 160 KB → 151 KB)