Unverified Commit 6cb8b8e2 authored by Andrews Cordolino Sobral, committed by GitHub

PML to PAIO (#823)

* renamed related folders and files

* pml > paio

* PML  > PAIO

* Proactive Machine Learning > ProActive AI Orchestration
parent 9e9c862b
@@ -32,7 +32,7 @@ asciidoctor {
sources {
include 'user/ProActiveUserGuide.adoc'
include 'JobPlanner/JobPlannerUserGuide.adoc'
include 'PML/PMLUserGuide.adoc'
include 'PAIO/PAIOUserGuide.adoc'
include 'PEO/PEOUserGuide.adoc'
include 'PSA/PSAUserGuide.adoc'
include 'admin/ProActiveAdminGuide.adoc'
@@ -5,7 +5,7 @@
image::architecture.png[align=center]
On the top left there is the *Studio* interface which allows you to build Workflows.
It can be interactively configured to address specific domains, for instance Finance, Big Data, IoT, Artificial Intelligence (AI). See for instance the Documentation of *ProActive Machine Learning* link:../PML/PMLUserGuide.html[here^], and try it online https://try.activeeon.com/studio/#workflows/templates/machine-learning[here^]. In the middle there is the *Scheduler* which enables an enterprise to orchestrate and automate Multi-users, Multi-application Jobs.
It can be interactively configured to address specific domains, for instance Finance, Big Data, IoT, Artificial Intelligence (AI). See, for instance, the documentation of *ProActive AI Orchestration* link:../PAIO/PAIOUserGuide.html[here^], and try it online https://try.activeeon.com/studio/#workflows/templates/machine-learning[here^]. In the middle there is the *Scheduler* which enables an enterprise to orchestrate and automate multi-user, multi-application Jobs.
Finally, at the bottom right is the *Resource manager* interface which manages and automates resource provisioning
on any Public Cloud, on any virtualization software, on any container system, and on any Physical Machine of any OS.
All the components you see come with fully Open and modern REST APIs.
PAIOUserGuide.html
\ No newline at end of file
<div id="footer-text-paio">
Version <span id="versionIdFooterPAIO"/>
</div>
<script>
document.getElementById('versionIdFooterPAIO').innerHTML = conf.version
</script>
:docinfo:
:toc:
:toc-title: PML User Guide
= ProActive Machine Learning
:toc-title: PAIO User Guide
= ProActive AI Orchestration
include::../common-settings.adoc[]
include::../all-doc-links.adoc[]
== Overview
=== What is ProActive Machine Learning (PML)?
=== What is ProActive AI Orchestration (PAIO)?
include::references/Overview.adoc[]
@@ -55,11 +55,11 @@ Suppose you need to predict houses prices based on this information (features) p
- *LSTAT* % lower status of the population
- *MEDV* Median value of owner-occupied homes in $1000's
Predicting houses prices is a complex problem, but we can simplify it a bit for this step-by-step example. We'll show you how you can easily create a predictive analytics solution using PML.
Predicting house prices is a complex problem, but we can simplify it a bit for this step-by-step example. We'll show you how you can easily create a predictive analytics solution using PAIO.
=== Manage the Canvas
To use PML, you need to add the *Machine Learning Bucket* as main catalog in the ProActive Studio. This bucket contains a set of generic tasks that enables you to upload and prepare data, train a model and test it.
To use PAIO, you need to add the *Machine Learning Bucket* as a main catalog in the ProActive Studio. This bucket contains a set of generic tasks that enable you to upload and prepare data, train a model and test it.
A. Open +++<a class="studioUrl" href="/studio" target="_blank">ProActive Workflow Studio</a>+++ home page.
@@ -83,7 +83,7 @@ A. Once dataset has been converted to *CSV* format, upload it into a cloud stora
For this tutorial, we will use the Boston house prices dataset available at this link:
https://s3.eu-west-2.amazonaws.com/activeeon-public/datasets/boston-houses-prices.csv
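If you want to inspect this dataset outside the platform, the short Python sketch below loads the same CSV that the <<Import_Data>> task will ingest. It assumes only that `pandas` is installed; it is not part of the PAIO tasks themselves.

[source,python]
----
# Minimal sketch: load the Boston house prices CSV used in this tutorial.
# Assumes pandas is installed; the columns are the features listed above.
import pandas as pd

url = "https://s3.eu-west-2.amazonaws.com/activeeon-public/datasets/boston-houses-prices.csv"
df = pd.read_csv(url)

print(df.shape)    # number of rows and columns
print(df.head())   # first rows, to check that the features were parsed correctly
----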
B. Drag and drop the <<Import_Data>> task from the *machine-learning* bucket in the ProActive Machine Learning.
B. Drag and drop the <<Import_Data>> task from the *machine-learning* bucket onto the ProActive AI Orchestration canvas.
C. Click on the task, then click `General Parameters` on the left to change the default parameters of this task.
@@ -113,14 +113,14 @@ image::prepare_data.gif[100000,2000]
=== Train a Predictive Model
Using PML, you can easily create different ML models in a single experiment and compare their results. This type of experimentation helps you find the best solution for your problem.
Using PAIO, you can easily create different ML models in a single experiment and compare their results. This type of experimentation helps you find the best solution for your problem.
You can also enrich the `machine-learning` bucket by adding new ML algorithms, and publish or customize an existing task according to your requirements, as the tasks are open source.
NOTE: To change the code of a task, click on it and open the `Task Implementation` section. You can also add new variables to a specific task.
In this step, we will create two different types of models and then compare their scores to decide which algorithm is the most suitable for our problem. Since the Boston dataset used in this example involves predicting the price of houses (a continuous label), we are dealing with a regression problem.
To solve this problem, we have to choose a regression algorithm to train the predictive model. To see the available regression algorithms available on the PML, see *ML Regression* Section in the *machine-learning* bucket.
To solve this problem, we have to choose a regression algorithm to train the predictive model. To see the regression algorithms available in PAIO, see the *ML Regression* section in the *machine-learning* bucket.
For this example, we will use <<Linear_Regression>> Task and <<Support_Vector_Regression>> Task.
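Outside the platform, the comparison performed by these two tasks can be reproduced in a few lines of scikit-learn. The sketch below is only an illustration of the idea (it assumes scikit-learn and pandas are installed and that the last CSV column is the label); it is not the code of the tasks themselves.

[source,python]
----
# Illustrative sketch: compare a linear regression and a support vector regression,
# conceptually what the Linear_Regression and Support_Vector_Regression tasks do.
# Assumes scikit-learn and pandas are installed; the last CSV column is the label.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.svm import SVR

url = "https://s3.eu-west-2.amazonaws.com/activeeon-public/datasets/boston-houses-prices.csv"
df = pd.read_csv(url)
X, y = df.iloc[:, :-1], df.iloc[:, -1]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=42)

for model in (LinearRegression(), SVR()):
    model.fit(X_train, y_train)
    # R^2 score on the held-out split, used here to compare the two algorithms
    print(type(model).__name__, round(model.score(X_test, y_test), 3))
----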
@@ -152,7 +152,7 @@ F. Connect each <<Preview_Results>> Task with <<Predict_Model>>.
image::test_the_predictive_model.png[100000,2000]
NOTE: if you have a pickled file (.pkl) containing a predictive model that you have learned using another platform, and you need to test it in the PML, you can load it using *Import_Model* Task.
NOTE: If you have a pickled file (.pkl) containing a predictive model trained on another platform and you need to test it in PAIO, you can load it using the *Import_Model* task.
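As a reminder of what such a `.pkl` file contains, the sketch below shows how a model trained elsewhere is typically pickled before being loaded with *Import_Model*. The file name and the tiny training set are illustrative only; any scikit-learn estimator could be serialized the same way.

[source,python]
----
# Sketch: serialize a model trained on another platform so it can later be
# loaded by the Import_Model task. "model.pkl" is only an example file name.
import pickle
from sklearn.linear_model import LinearRegression

X = [[1.0], [2.0], [3.0]]
y = [2.0, 4.0, 6.0]
model = LinearRegression().fit(X, y)   # any already-trained estimator

with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# The same file can be unpickled later to score new data:
with open("model.pkl", "rb") as f:
    restored = pickle.load(f)
print(restored.predict([[4.0]]))
----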
=== Run the Experiment and Preview the Results
@@ -176,7 +176,7 @@ image::execute.gif[100000,2000]
The `auto-ml-optimization` bucket contains the `Distributed_Auto_ML` workflow that can be easily used to find the operating parameters for any system whose performance can be measured as a function of adjustable parameters.
It is an estimator that minimizes the posterior expected value of a loss function.
This bucket also comes with a set of workflows' examples that demonstrates how we can optimize mathematical functions, PML workflows and machine/deep learning algorithms from scripts using AutoML tuners.
This bucket also comes with a set of workflow examples that demonstrate how to optimize mathematical functions, PAIO workflows, and machine/deep learning algorithms from scripts using AutoML tuners.
In the following subsections, several tables describe the main variables that characterize the AutoML workflows.
In addition to the variables mentioned below, there is a set of generic variables, common to all workflows,
which can be found in the subsection <<AI Workflows Common Variables>>.
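To make "performance measured as a function of adjustable parameters" concrete, here is a small Bayesian-optimization sketch using the scikit-optimize library. It is not the `Distributed_Auto_ML` tuner itself, just an illustration of the kind of loss function such a tuner minimizes; scikit-optimize is assumed to be installed.

[source,python]
----
# Illustration only: minimize a loss over two adjustable parameters with
# Bayesian optimization (scikit-optimize), mimicking what an AutoML tuner does.
# This is not the Distributed_Auto_ML implementation.
from skopt import gp_minimize

def loss(params):
    x, y = params
    # Any measurable performance metric can go here (error rate, latency, ...).
    return (x - 1.0) ** 2 + (y + 2.0) ** 2

result = gp_minimize(loss, dimensions=[(-5.0, 5.0), (-5.0, 5.0)], n_calls=25, random_state=0)
print("best parameters:", result.x, "best loss:", result.fun)
----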
@@ -392,7 +392,7 @@ The following workflows have common variables with the above illustrated workflo
*CIFAR_100_Image_Classification:* trains a simple deep CNN on the CIFAR100 images dataset using the Keras library.
*Image_Object_Detection:* trains a YOLO model on the coco dataset using PML deep learning generic tasks.
*Image_Object_Detection:* trains a YOLO model on the COCO dataset using PAIO deep learning generic tasks.
*Digits_Classification:* python script illustrating an example of multiple machine learning models optimization.
@@ -472,7 +472,7 @@ References:
Once a predictive model is built, tested and validated, you can easily use it in real-world production pipelines by deploying it as a REST web service via the MaaS_ML service.
MaaS_ML is dedicated to making deployments of lightweight machine learning (ML) models simple, portable, and scalable, and to easily managing their lifecycles. This is particularly useful for engineering or business teams that want to take advantage of the model.
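Once a model is deployed behind MaaS_ML, client applications typically reach it over HTTP. The sketch below is hypothetical: the host, route and JSON payload layout depend on your deployment and on the entry script, so treat them as placeholders rather than the actual MaaS_ML API.

[source,python]
----
# Hypothetical client call to a model deployed as a REST web service.
# The URL, route and payload layout are placeholders; the real values depend on
# the MaaS_ML instance configuration and its PYTHON_ENTRYPOINT script.
import requests

url = "https://<proactive-server>:9000/api/predict"   # placeholder endpoint
payload = {"dataframe_json": '[{"CRIM": 0.02, "RM": 6.5, "LSTAT": 5.2}]'}  # placeholder schema

response = requests.post(url, json=payload, timeout=30)
print(response.status_code, response.json())
----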
The life cycle of any MaaS_ML instance (i.e., from starting the generic service instance, deploying an AI specific model to pausing or deleting the instance) can be managed in three different ways in PML :
The life cycle of any MaaS_ML instance (i.e., from starting the generic service instance and deploying an AI-specific model, to pausing or deleting the instance) can be managed in three different ways in PAIO:
- Using the *Studio Portal* and more specifically the bucket *model-as-a-service* where specific generic tasks are provided to process all the possible actions (i.e., MaaS_ML_Service_Start, MaaS_ML_Deploy_Model, MaaS_ML_Call_Prediction, MaaS_ML_Actions[Finish/Pause/Resume]).
These tasks can be easily integrated to your AI pipelines/workflows as you can see in this <<Deployment Pipeline Example>>.
@@ -492,7 +492,7 @@ you can also trigger the deployment of a specific VM using the Resource Manager
In the following subsections, we will illustrate the MaaS_ML instance life cycle, from starting the generic service instance,
deploying a specific model, pausing it, to deleting the instance. We will also describe how the MaaS_ML instance life cycle
can be managed via four different ways in PML:
can be managed in four different ways in PAIO:
. <<MaaS_ML Via Workflow Execution Portal>>
. <<MaaS_ML Via Studio Portal>>
@@ -1083,7 +1083,7 @@ In case a data drift has occurred, a user will receive a notification using the
MaaS_DL is a model deployment service for putting AI models into production. MaaS_DL comes with new capabilities compared to MaaS_ML, enabling users to deploy deep learning models, to update the deployed models with a new version and to easily roll back to any previous version(s).
It provides out-of-the-box integration with TensorFlow Serving (TFX), taking advantage of its flexible, high-performance serving system.
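Because MaaS_DL builds on TensorFlow Serving, a deployed model can be queried through TensorFlow Serving's standard REST predict API. In the sketch below, the host and model name are placeholders; the `/v1/models/<name>:predict` route and the `instances` payload are the standard TensorFlow Serving REST format.

[source,python]
----
# Query a model served by TensorFlow Serving (the engine behind MaaS_DL).
# Host and model name are placeholders; the route and payload follow the
# standard TensorFlow Serving REST API.
import requests

host = "http://<serving-host>:8501"          # placeholder
model_name = "my_model"                      # placeholder
url = f"{host}/v1/models/{model_name}:predict"

payload = {"instances": [[1.0, 2.0, 5.0]]}   # one input example
response = requests.post(url, json=payload, timeout=30)
print(response.json())                        # {"predictions": [...]}
----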
The life cycle of any MaaS_DL instance (i.e., from starting the generic service instance, deploying an AI specific model to pausing or deleting the instance) can be managed in three different ways in PML :
The life cycle of any MaaS_DL instance (i.e., from starting the generic service instance and deploying an AI-specific model, to pausing or deleting the instance) can be managed in three different ways in PAIO:
- Using the *Studio Portal* and more specifically the bucket *model-as-a-service* where specific generic tasks are provided to process all the possible actions (i.e., MaaS_DL_Service_Start, MaaS_DL_Deploy_Model, MaaS_DL_Actions[Finish/Pause/Resume], MaaS_DL_Undeploy_Model).
These tasks can be easily integrated to your AI pipelines/workflows as you can see in this <<Deployment Pipeline Example>>.
@@ -1094,7 +1094,7 @@ Using MaaS_DL, you can easily deploy and use any machine or deep learning model
you can also trigger the deployment of a specific VM using the Resource Manager elastic policies, and eventually deploy a Model-Service on that specific node.
In the following subsections, we will describe the MaaS_DL instance life cycle, from starting the generic service instance,
deploying a specific model, undeploying it, to deleting the instance. We will also describe how the MaaS_DL instance life cycle can be managed via four different ways in PML:
deploying a specific model, undeploying it, to deleting the instance. We will also describe how the MaaS_DL instance life cycle can be managed in four different ways in PAIO:
. <<MaaS_DL Via Workflow Execution Portal>>
. <<MaaS_DL Via Studio Portal>>
@@ -1590,7 +1590,7 @@ To access the AutoFeat page, please follow the steps below:
. Create a new workflow.
. Drag and drop the `Import_Data_And_Automate_Feature_Engineering` task from the *machine-learning* bucket in the ProActive Machine Learning.
. Drag and drop the `Import_Data_And_Automate_Feature_Engineering` task from the *machine-learning* bucket onto the ProActive AI Orchestration canvas.
. Click on the task, then click `General Parameters` on the left to change the default parameters of this task.
@@ -1695,7 +1695,7 @@ It offers several functionalities, including:
- Charts to track variables and results evolution and correlation.
- Data exportation in multiple formats for further use in analytics tools.
ProActive Analytics is very useful to compare metrics and charts of workflows that have common variables and results. For example, a ML algorithm might take different variables values and produce multiple results. It would be interesting to analyze the correlation and evolution of the algorithm results regarding the input variation (See also a similar example of link:../PML/PMLUserGuide.html#_AutoML[AutoML]).
ProActive Analytics is very useful for comparing metrics and charts of workflows that have common variables and results. For example, an ML algorithm might take different variable values and produce multiple results. It would be interesting to analyze the correlation and evolution of the algorithm results with respect to the input variation (see also a similar example in link:../PAIO/PAIOUserGuide.html#_AutoML[AutoML]).
The following sections will show you some key features of the dashboard and how to use them for a better understanding of your job executions.
[[_job_search]]
@@ -2502,11 +2502,11 @@ NOTE: The workflow represented in the above is available on the 'machine-learnin
== ML Workflows Examples
The PML provides a fast, easy and practical way to execute different workflows using the ML bucket. We present useful ML workflows for different applications in the following subsections.
PAIO provides a fast, easy and practical way to execute different workflows using the ML bucket. We present useful ML workflows for different applications in the following subsections.
To test these workflows, you need to add the *machine-learning-workflows* bucket as a main catalog in the ProActive Studio.
A. Open +++<a class="studioUrl" href="/studio" target="_blank">ProActive Machine Learning</a>+++ home page.
A. Open +++<a class="studioUrl" href="/studio" target="_blank">ProActive AI Orchestration</a>+++ home page.
B. Create a new workflow.
@@ -2611,13 +2611,13 @@ Please find in the table below the list of algorithms which have GPU support and
== Deep Learning Workflows Examples
PML provides a fast, easy and practical way to execute deep learning workflows. In the following subsections, we present useful deep learning workflows for text and image classification and generation.
PAIO provides a fast, easy and practical way to execute deep learning workflows. In the following subsections, we present useful deep learning workflows for text and image classification and generation.
video::FwMPR87wzoo[youtube, width=700, height=400 start=0, position=center]
You can test these workflows by following these steps:
A. Open +++<a class="studioUrl" href="/studio" target="_blank">ProActive Machine Learning</a>+++ home page.
A. Open +++<a class="studioUrl" href="/studio" target="_blank">ProActive AI Orchestration</a>+++ home page.
B. Create a new workflow.
@@ -2729,7 +2729,7 @@ WARNING: It is recommended to use an enabled-GPU node to run the deep learning t
=== AI Workflows Common Variables
In the following table, you can find the variables that are common between
most of the available AI workflows in PML associated to their descriptions.
most of the available AI workflows in PAIO, along with their descriptions.
[cols="2,5,2"]
|===
@@ -34,7 +34,7 @@ Scheduler Web Interface:: ProActive component that provides a web interface to t
Workflow Studio:: ProActive component that provides a web interface for designing <<_glossary_workflow,*Workflows*>>.
[[_glossary_machine_learning_open_studio]]
Proactive Machine Learning:: PML component that provides a web interface for designing and composing ML <<_glossary_workflow,*Workflows*>> with drag and drop.
ProActive AI Orchestration:: PAIO component that provides a web interface for designing and composing ML <<_glossary_workflow,*Workflows*>> with drag and drop.
[[_glossary_job_planner_portal]]
Job Planner Portal:: ProActive component that provides a web interface for planning <<_glossary_workflow,*Workflows*>>, and creating <<_glossary_workflow,*Calendar Definitions*>>
*Proactive Machine Learning (PML)* is a complete DSML platform (Data Science and Machine Learning) including a ML Studio, AutoML, Data Science Orchestration and MLOps for the deployment,
*ProActive AI Orchestration (PAIO)* is a complete DSML platform (Data Science and Machine Learning) including an ML Studio, AutoML, Data Science Orchestration and MLOps for the deployment,
training, execution and scalability of artificial intelligence and machine learning models on
any type of infrastructure. Created for data scientists and ML engineers, the solution is simple to use and accelerates the development and deployment of machine learning models.
image::PML_overview.PNG[align=center]
Proactive Machine Learning platform provides a rich catalog of generic machine learning tasks that can be connected together to build either basic or advanced machine learning workflows for various use cases such as: fraud detection, text analysis, online offer recommendations, prediction of equipment failures, facial expression analysis, etc.
PML workflows enable users to manage machine learning pipelines through the different phases of the development lifecycle and allow them to better control tasks parallelization, by running the tasks on resources matching constraints (Multi-CPU, GPU, FPGA, data locality, libraries, etc).
image::PAIO_overview.PNG[align=center]
The ProActive AI Orchestration platform provides a rich catalog of generic machine learning tasks that can be connected together to build either basic or advanced machine learning workflows for various use cases such as: fraud detection, text analysis, online offer recommendations, prediction of equipment failures, facial expression analysis, etc.
PAIO workflows enable users to manage machine learning pipelines through the different phases of the development lifecycle and allow them to better control task parallelization, by running the tasks on resources matching constraints (Multi-CPU, GPU, FPGA, data locality, libraries, etc).
image::PML-Open-Studio-ActiveEon.PNG[align=center]
image::PAIO-Open-Studio-ActiveEon.PNG[align=center]
The Proactive Machine Learning platform is an open source solution, and it can be tested online without installation on our try platforms https://try.activeeon.com/studio/#workflows/templates/machine-learning[here^].
The ProActive AI Orchestration platform is an open source solution, and it can be tested online without installation on our try platforms https://try.activeeon.com/studio/#workflows/templates/machine-learning[here^].
PMLUserGuide.html
\ No newline at end of file
<div id="footer-text-pml">
Version <span id="versionIdFooterPML"/>
</div>
<script>
document.getElementById('versionIdFooterPML').innerHTML = conf.version
</script>
@@ -93,7 +93,7 @@ It serves as a generic template that can be used to create and start any docker
| options like environment variables, etc
| No
| String
| e.g. `-e POSTGRES_USER=pml -e POSTGRES_PASSWORD=proactive -e POSTGRES_DB=activeeon`
| e.g. `-e POSTGRES_USER=paio -e POSTGRES_PASSWORD=proactive -e POSTGRES_DB=activeeon`
|`DOCKER_COMMAND`
| Command to be executed inside the docker container
| No
@@ -681,7 +681,7 @@ The service is started using the following variable.
=== Model as a Service for Machine Learning
This service allows the deployment of an instance of MaaS_ML through the ProActive Service Automation (PSA) Portal.
NOTE: More details about the related actions of this service can be found in the link:/doc/PML/PMLUserGuide.html#_via_service_automation_portal[PML Doc].
NOTE: More details about the related actions of this service can be found in the link:/doc/PAIO/PAIOUserGuide.html#_via_service_automation_portal[PAIO Doc].
The service is started using the following variables.
@@ -748,7 +748,7 @@ The service is started using the following variables.
| Boolean
| `false`
| `PYTHON_ENTRYPOINT`
| This entry script starts the service and defines the different functions to deploy the model, scores the prediction requests based on the deployed model, and returns the results. This script is specific to your model. This file should be stored in the Catalog under the `model_as_service_resources` bucket. More information about this file can be found in the link:../PML/PMLUserGuide.html#_customize_the_service[Customize the Service] section.
| This entry script starts the service and defines the different functions to deploy the model, score the prediction requests based on the deployed model, and return the results. This script is specific to your model. This file should be stored in the Catalog under the `model_as_service_resources` bucket. More information about this file can be found in the link:../PAIO/PAIOUserGuide.html#_customize_the_service[Customize the Service] section.
| Yes
| String
| `ml_service`
@@ -768,7 +768,7 @@ The service is started using the following variables.
| Boolean
| `true`
| `YAML_FILE`
| A YAML file that describes the OpenAPI Specification ver. 2 (known as Swagger Spec) of the service. This file should be stored in the catalog under the `model_as_service_resources` bucket. More information about the structure of this file can be found in the section link:../PML/PMLUserGuide.html#_customize_the_service[Customize the Service].
| A YAML file that describes the OpenAPI Specification ver. 2 (known as Swagger Spec) of the service. This file should be stored in the catalog under the `model_as_service_resources` bucket. More information about the structure of this file can be found in the section link:../PAIO/PAIOUserGuide.html#_customize_the_service[Customize the Service].
| Yes
| String
| `ml_service-api`
@@ -788,7 +788,7 @@ The service is started using the following variables.
=== Model as a Service for Deep Learning
This service allows the deployment of an instance of MaaS_DL through the ProActive Service Automation (PSA) Portal.
NOTE: More details about the related actions of this service can be found in the link:/doc/PML/PMLUserGuide.html#_via_service_automation_portal[PML Doc].
NOTE: More details about the related actions of this service can be found in the link:/doc/PAIO/PAIOUserGuide.html#_via_service_automation_portal[PAIO Doc].
The service is started using the following variables.
@@ -864,7 +864,7 @@ The service is started using the following variables.
| Boolean
| `false`
| `PYTHON_ENTRYPOINT`
| This entry script starts the service and defines the different functions to deploy the model, scores the prediction requests based on the deployed model, and returns the results. This script is specific to your model. This file should be stored in the Catalog under the `model_as_service_resources` bucket. More information about this file can be found in the link:../PML/PMLUserGuide.html#_customize_the_service[Customize the Service] section.
| This entry script starts the service and defines the different functions to deploy the model, score the prediction requests based on the deployed model, and return the results. This script is specific to your model. This file should be stored in the Catalog under the `model_as_service_resources` bucket. More information about this file can be found in the link:../PAIO/PAIOUserGuide.html#_customize_the_service[Customize the Service] section.
| Yes
| String
| `dl_service`
@@ -884,7 +884,7 @@ The service is started using the following variables.
| Boolean
| `true`
| `YAML_FILE`
| A YAML file that describes the OpenAPI Specification ver. 2 (known as Swagger Spec) of the service. This file should be stored in the catalog under the `model_as_service_resources` bucket. More information about the structure of this file can be found in the section link:../PML/PMLUserGuide.html#_customize_the_service[Customize the Service].
| A YAML file that describes the OpenAPI Specification ver. 2 (known as Swagger Spec) of the service. This file should be stored in the catalog under the `model_as_service_resources` bucket. More information about the structure of this file can be found in the section link:../PAIO/PAIOUserGuide.html#_customize_the_service[Customize the Service].
| Yes
| String
| `dl_service-api`
@@ -31,10 +31,10 @@
</div>
</div>
<div class="sect2">
<h3 id="_proactive_machine_learning_pml"><a class="anchor" href="#_proactive_machine_learning_pml"></a><a class="link" href="#_proactive_machine_learning_pml">ProActive Machine Learning (PML)</a></h3>
<h3 id="_proactive_machine_learning_paio"><a class="anchor" href="#_proactive_machine_learning_paio"></a><a class="link" href="#_proactive_machine_learning_paio">ProActive AI Orchestration (PAIO)</a></h3>
<div class="ulist">
<ul>
<li><p><h4 id="_all_doc_pml_user_guide" style="display:inline;font-size:1em"><a href="../PML/PMLUserGuide.html#_all_doc_pml_user_guide">PML User Guide</a></h4>
<li><p><h4 id="_all_doc_paio_user_guide" style="display:inline;font-size:1em"><a href="../PAIO/PAIOUserGuide.html#_all_doc_paio_user_guide">PAIO User Guide</a></h4>
&nbsp;(a complete Data Science and Machine Learning platform, with Studio & MLOps)</p>
</li>
</ul>
@@ -48,7 +48,7 @@
<a href="../javadoc/index.html?org/ow2/proactive/scheduler/rest/SchedulerClient.html">Scheduler Java</a>  
<a href="../javadoc/index.html?org/ow2/proactive/scheduler/common/job/TaskFlowJob.html">Workflow Creation Java</a>  
<a href="https://github.com/ow2-proactive/proactive-python-client#proactive-scheduler-client">Python Client</a>  
<a href="../PML/PMLUserGuide.html#_proactive_jupyter_kernel">Jupyter Kernel</a>
<a href="../PAIO/PAIOUserGuide.html#_proactive_jupyter_kernel">Jupyter Kernel</a>
<a href="https://www.activeeon.com/public_content/documentation/csharp-client/">.NET Client</a>
</strong></p>
</div>
@@ -15,12 +15,12 @@
** link:PSA/PSAUserGuide.html[*Service Automation*] (PaaS On-Demand, Service deployment and management)
* link:admin/ProActiveAdminGuide.html[*PWS Admin Guide*] (Installation, Infrastructure & Nodes setup, Agents,…​)
=== ProActive Machine Learning (PML)
=== ProActive AI Orchestration (PAIO)
* link:PML/PMLUserGuide.html[*PML User Guide*] (​a complete Data Science and Machine Learning platform, with Studio & MLOps)
* link:PAIO/PAIOUserGuide.html[*PAIO User Guide*] (​a complete Data Science and Machine Learning platform, with Studio & MLOps)
=== API documentation
link:rest/[*Scheduler REST*] &emsp; link:user/ProActiveUserGuide.html#_scheduler_graphql_api[*Scheduler GraphQL*] &emsp; link:user/ProActiveUserGuide.html#_scheduler_command_line[*Scheduler CLI*] &emsp; link:javadoc/index.html?org/ow2/proactive/scheduler/rest/SchedulerClient.html[*Scheduler Java*] &emsp; link:javadoc/index.html?org/ow2/proactive/scheduler/common/job/TaskFlowJob.html[*Workflow Creation Java*] &emsp; https://github.com/ow2-proactive/proactive-python-client#proactive-scheduler-client[*Python Client*] &emsp; link:PML/PMLUserGuide.html#_proactive_jupyter_kernel[*Jupyter Kernel*] &emsp; https://www.activeeon.com/public_content/documentation/csharp-client/[*.NET Client*]
link:rest/[*Scheduler REST*] &emsp; link:user/ProActiveUserGuide.html#_scheduler_graphql_api[*Scheduler GraphQL*] &emsp; link:user/ProActiveUserGuide.html#_scheduler_command_line[*Scheduler CLI*] &emsp; link:javadoc/index.html?org/ow2/proactive/scheduler/rest/SchedulerClient.html[*Scheduler Java*] &emsp; link:javadoc/index.html?org/ow2/proactive/scheduler/common/job/TaskFlowJob.html[*Workflow Creation Java*] &emsp; https://github.com/ow2-proactive/proactive-python-client#proactive-scheduler-client[*Python Client*] &emsp; link:PAIO/PAIOUserGuide.html#_proactive_jupyter_kernel[*Jupyter Kernel*] &emsp; https://www.activeeon.com/public_content/documentation/csharp-client/[*.NET Client*]
== Links