Commit b82d93e2 authored by Gilles Mouchard

Updated "Open source PKM server-side software".

parent e024e0b9
Pipeline #18081 passed with stages in 9 minutes and 6 seconds
......@@ -2,7 +2,7 @@
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="date" content='12/14/21'>
<meta name="date" content='12/16/21'>
<title>D1.5 Open source PKM server-side software</title>
<link rel="stylesheet" href='../style/style.css' type="text/css">
</head>
......@@ -20,7 +20,7 @@
<h2 id="configuration-optional">2.1 Configuration (optional)<a id="1"></a></h2>
<p>The PKM server has a configuration file, named <code>pkm_config.json</code>.</p>
<p>The PKM server comes with a default configuration file available on the <a href="https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/pkm_config.json">pkm-api gitlab repository</a>.</p>
<p>The default configuration file, which content is outlined below, is good for most people:</p>
<p>The default configuration file, whose content is outlined below, is good for most people:</p>
<pre><code>{
&quot;debug&quot;: true,
&quot;db_host&quot;: &quot;pkm-api_mongodb_1:27017&quot;,
......@@ -117,8 +117,8 @@ Commands:
uninstall
Uninstall DECODER EU Project Tool-chain (but preserve your saved preferences)</code></pre>
<h2 id="minimal-docker-installation">2.3 Minimal Docker installation<a id="3"></a></h2>
<p>The installation procedure in this section applies only to PKM and some parsers for which <a href="https://gitlab.ow2.org/decoder/pkm-api">pkm-api gitlab repository</a> hosts the source code. This installation procedure is <strong>not</strong> for all the docker images of the DECODER EU Project Tool-chain. For a complete installation, refers to Section 2.2.</p>
<p>The built and installed services are the followings:</p>
<p>The installation procedure in this section applies only to PKM and some parsers for which the <a href="https://gitlab.ow2.org/decoder/pkm-api">pkm-api gitlab repository</a> hosts the source code. This installation procedure is <strong>not</strong> for all the docker images of the DECODER EU Project Tool-chain. For a complete installation, refer to Section 2.2.</p>
<p>The built and installed services are the following:</p>
<ul>
<li>PKM</li>
<li>Frama-C for PKM</li>
......@@ -127,8 +127,8 @@ Commands:
<li>UNISIM Excavator for PKM</li>
</ul>
<p>This minimal set of services handles the PKM (document management and querying) and parsing C, C++, .docx, and executable binary files.</p>
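<p>As an orientation, assuming the standard docker-compose workflow (the detailed commands are given in Sections 2.3.2 and 2.3.3), the whole minimal installation roughly boils down to:</p>
<pre><code>$ git clone https://gitlab.ow2.org/decoder/pkm-api.git
$ cd pkm-api
$ docker-compose build   # build the minimal set of docker images (Section 2.3.2)
$ docker-compose up      # start the services (Section 2.3.3)</code></pre>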
<h3 id="setting-up-administrators-credentials-optional">2.3.1 Setting up administrators credentials (optional)<a id="3.1"></a></h3>
<p>MongoDB and PKM require an initialization. When PKM and MongoDB services start for the time, the initialization starts automatically. Once initialized, the MongoDB and PKM services gracefully ignore later attempts to initialize. The stock MongoDB database is virgin with security disabled, so the initialization of the MongoDB service consists in creating an administrator (a “superuser”) in database admin (admin@admin), which can create users and roles in any MongoDB database, then enabling MongoDB security. The initialization of the PKM service consists in creating the PKM management database (where all the PKM users live and where the list of PKM projects is) and then an initial PKM administrator. That PKM administrator can later create other users and even other PKM administrators in the PKM management database. The PKM initialization runs on behalf of the MongoDB “superuser”.</p>
<h3 id="setting-up-administrators-credentials-optional">2.3.1 Setting up administrators credentials (optional)<a id="3.1"></a></h3>
<p>MongoDB and PKM require an initialization. When PKM and MongoDB services start for the first time, the initialization starts automatically. Once initialized, the MongoDB and PKM services gracefully ignore later attempts to initialize. The stock MongoDB database is virgin with security disabled, so the initialization of the MongoDB service consists in creating an administrator (a “superuser”) in database admin (admin@admin), who can create users and roles in any MongoDB database, then enabling MongoDB security. The initialization of the PKM service consists in creating the PKM management database (where all the PKM users live and where the list of PKM projects is) and then an initial PKM administrator. That PKM administrator can later create other users and even other PKM administrators in the PKM management database. The PKM initialization runs on behalf of the MongoDB “superuser”.</p>
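<p>For illustration only, the MongoDB part of this initialization is conceptually equivalent to the mongo shell commands below; in practice it runs automatically, and the actual user name and password come from <code>mongodb/credentials.json</code> (shown next):</p>
<pre><code>use admin
db.createUser({
  user: &quot;admin&quot;,
  pwd: &quot;&lt;superuser password from credentials.json&gt;&quot;,
  roles: [ { role: &quot;userAdminAnyDatabase&quot;, db: &quot;admin&quot; } ]
})
// ...then MongoDB is restarted with access control (security) enabled</code></pre>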
<p>The credentials used during initialization are in <a href="https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/mongodb/credentials.json"><code>mongodb/credentials.json</code></a>:</p>
<pre><code>{
&quot;superuser&quot;: &quot;admin&quot;,
......@@ -161,7 +161,7 @@ Commands:
<pre><code>$ docker-compose up</code></pre>
</div>
<div id="footer">
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/14/21</span>
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/16/21</span>
</div>
</body>
</html>
......@@ -6,7 +6,7 @@ The PKM server has a configuration file, named `pkm_config.json`.
The PKM server comes with a default configuration file available on the [pkm-api gitlab repository](https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/pkm_config.json).
The default configuration file, which content is outlined below, is good for most people:
The default configuration file, whose content is outlined below, is good for most people:
{
"debug": true,
......@@ -116,9 +116,9 @@ As soon as the graphical user interface has started in your default web browser,
The installation procedure in this section applies only to PKM and some parsers for which the [pkm-api gitlab repository](https://gitlab.ow2.org/decoder/pkm-api) hosts the source code.
This installation procedure is **not** for all the docker images of the DECODER EU Project Tool-chain.
For a complete installation, refers to Section 2.2.
For a complete installation, refer to Section 2.2.
The built and installed services are the followings:
The built and installed services are the following:
* PKM
* Frama-C for PKM
......@@ -128,12 +128,12 @@ The built and installed services are the followings:
This minimal set of services handles the PKM (document management and querying) and parsing C, C++, .docx, and executable binary files.
### 2.3.1 Setting up administrators credentials (optional)<a id="3.1"></a>
### 2.3.1 Setting up administrators' credentials (optional)<a id="3.1"></a>
MongoDB and PKM require an initialization.
When PKM and MongoDB services start for the time, the initialization starts automatically.
When PKM and MongoDB services start for the first time, the initialization starts automatically.
Once initialized, the MongoDB and PKM services gracefully ignore later attempts to initialize.
The stock MongoDB database is virgin with security disabled, so the initialization of the MongoDB service consists in creating an administrator (a "superuser") in database admin (admin@admin), which can create users and roles in any MongoDB database, then enabling MongoDB security.
The stock MongoDB database is virgin with security disabled, so the initialization of the MongoDB service consists in creating an administrator (a "superuser") in database admin (admin@admin), who can create users and roles in any MongoDB database, then enabling MongoDB security.
The initialization of the PKM service consists in creating the PKM management database (where all the PKM users live and where the list of PKM projects is) and then an initial PKM administrator.
That PKM administrator can later create other users and even other PKM administrators in the PKM management database.
The PKM initialization runs on behalf of the MongoDB "superuser".
......
......@@ -2,7 +2,7 @@
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="date" content='12/14/21'>
<meta name="date" content='12/16/21'>
<title>D1.5 Open source PKM server-side software</title>
<link rel="stylesheet" href='../style/style.css' type="text/css">
</head>
......@@ -27,7 +27,7 @@
<figure>
<img src="pkm-server-software-architecture.png" alt="Figure 3: PKM server software architecture" /><figcaption aria-hidden="true">Figure 3: PKM server software architecture</figcaption>
</figure>
<p>The bottom layer encompasses the underlying Node.js Javascript run-time, the MongoDB Node.js driver, the MongoDB database server, the local file system, and Git. On top of this, the <em>PKM util</em> layer is a kind of hardware abstraction layer that abstracts file system I/O and simplifies the process of tools execution. The <em>PKM core</em> provides Class <code>PKM</code>, which implements the <code>PKM</code> and is responsible of database accesses using the <em>MongoDB Node.js driver</em>. <em>PKM util</em> and <em>PKM core</em> constitute a Javascript SDK for the PKM whose purpose is mostly to allow third party contributors to extend the PKM. On the highest level, there are two kinds of front-ends:</p>
<p>The bottom layer encompasses the underlying Node.js Javascript run-time, the MongoDB Node.js driver, the MongoDB database server, the local file system, and Git. On top of this, the <em>PKM util</em> layer is a kind of hardware abstraction layer that abstracts file system I/O and simplifies the process of tools execution. The <em>PKM core</em> provides Class <code>PKM</code>, which implements the <code>PKM</code> and is responsible for database accesses using the <em>MongoDB Node.js driver</em>. <em>PKM util</em> and <em>PKM core</em> constitute a Javascript SDK for the PKM whose purpose is mostly to allow third party contributors to extend the PKM. On the highest level, there are two kinds of front-ends:</p>
<ul>
<li>the user’s console where the user (or scripts) can interact with the PKM,</li>
<li>and the user’s interface &amp; tools, which can communicate with the PKM.</li>
......@@ -46,7 +46,7 @@
<figure>
<img src="pkm-collections.png" alt="Figure 4: PKM MongoDB collections" /><figcaption aria-hidden="true">Figure 4: PKM MongoDB collections</figcaption>
</figure>
<p>The <code>Project</code> collection hosts the project metadata (name, members). Each project has a collection for Git working trees metadata, which the PKM server updates after running Git commands. Some collections track the tool executions (invocations and logs). The tool specifications enable GUI front-ends to create well-formed tool invocations (for the Process Engine). The methodology have a collection for the status of each phase of the methodology. The programming artefacts have both collections for files (both executable binaries and source codes), for compile commands (with compilation flags and options), and (after parsing) for source code Abstract Syntax Trees (ASTs), comments, and annotations. The models have also collections for both the UML files, and (after parsing) for UML class diagrams and state machines. The documentation have a collection for the documentation files (e.g. .docx files), for the Abstract Semi-Formal Models (ASFMs), which Doc to ASFM can generate from .docx files, and for the Graphical documentation written in the Graphical Specification Language (GSL). Links discovering tool (Semantic parsing) populates the 2D traceability matrix, while Semantic Role Labeling (SRL) and Named Entity Recognition (NER) tools extract information and synthesize Annotations in a dedicated collection. The TESTAR tool (automated GUI testing) has specialized collections for settings, models, and results. Finally, there are collections for Common Vulnerabilities and Exposures, and the reviews. Note that from the PKM REST API point of view the tool specifications, the TESTAR settings, and the methodology status are also properties of the project, even if in reality they are stored in separate collections. The PKM populates, at project creation, the initial methodology status and a predefined set of tool specifications.</p>
<p>The <code>Project</code> collection hosts the project metadata (name, members). Each project has a collection for Git working trees metadata, which the PKM server updates after running Git commands. Some collections track the tool executions (invocations and logs). The tool specifications enable GUI front-ends to create well-formed tool invocations (for the Process Engine). The methodology has a collection for the status of each phase of the methodology. The programming artefacts have collections for files (both executable binaries and source code), for compile commands (with compilation flags and options), and (after parsing) for source code Abstract Syntax Trees (ASTs), comments, and annotations. The models also have collections for the UML files and (after parsing) for UML class diagrams and state machines. The documentation has collections for the documentation files (e.g. .docx files), for the Abstract Semi-Formal Models (ASFMs), which Doc to ASFM can generate from .docx files, and for the Graphical documentation written in the Graphical Specification Language (GSL). The links discovering tool (Semantic parsing) populates the 2D traceability matrix, while Semantic Role Labeling (SRL) and Named Entity Recognition (NER) tools extract information and synthesize Annotations in a dedicated collection. The TESTAR tool (automated GUI testing) has specialized collections for settings, models, and results. Finally, there are collections for Common Vulnerabilities and Exposures, and the reviews. Note that, from the PKM REST API point of view, the tool specifications, the TESTAR settings, and the methodology status are also properties of the project, even if in reality they are stored in separate collections. The PKM populates, at project creation, the initial methodology status and a predefined set of tool specifications.</p>
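<p>As an illustration only (this is not part of the PKM SDK; the one-database-per-project naming and the omission of authentication options are assumptions made for brevity), the collections of a project can be inspected directly with the <em>MongoDB Node.js driver</em>:</p>
<pre><code>const { MongoClient } = require(&#39;mongodb&#39;);

// Illustrative sketch: list the collections of a PKM project database.
async function listProjectCollections(uri, projectDbName)
{
    const client = await MongoClient.connect(uri); // e.g. &#39;mongodb://pkm-api_mongodb_1:27017&#39;
    try
    {
        const collections = await client.db(projectDbName).listCollections().toArray();
        return collections.map((c) =&gt; c.name); // e.g. &#39;Project&#39;, &#39;CompileCommands&#39;, ...
    }
    finally
    {
        await client.close();
    }
}</code></pre>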
<p>The following subsections detail the collections in a PKM project, which the table below summarizes:</p>
<table>
<thead>
......@@ -1121,7 +1121,7 @@
}
}</code></pre>
<h3 id="compile-commands">3.2.6 Compile commands <a id="2.6"></a></h3>
<p>The collection for compile commands, which contain the compiler command options and flags for each compilation units, is <code>'CompileCommands'</code>. The PKM compile commands derives from <a href="https://clang.llvm.org/docs/JSONCompilationDatabase.html">LLVM compile commands</a> with one exception: the <code>directory</code> and <code>file</code> properties are path relative to the virtual root directory of the PKM project.</p>
<p>The collection for compile commands, which contains the compiler command options and flags for each compilation unit, is <code>'CompileCommands'</code>. The PKM compile commands derive from <a href="https://clang.llvm.org/docs/JSONCompilationDatabase.html">LLVM compile commands</a> with one exception: the <code>directory</code> and <code>file</code> properties are paths relative to the virtual root directory of the PKM project.</p>
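<p>For illustration, the LLVM-style core of a single entry looks roughly like the following (the paths and compiler invocation are purely hypothetical, and the PKM-specific wrapper properties are defined by the schema below):</p>
<pre><code>{
  &quot;directory&quot;: &quot;src&quot;,
  &quot;file&quot;: &quot;src/main.c&quot;,
  &quot;command&quot;: &quot;gcc -O2 -Wall -c main.c -o main.o&quot;
}</code></pre>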
<p>The complete schema for compile commands can be downloaded from the <a href="https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/api/pkm-compile-command-schema.json">pkm-api gitlab repository</a>. The schema is the following:</p>
<pre><code>{
&quot;$schema&quot;: &quot;http://json-schema.org/draft-07/schema#&quot;,
......@@ -1919,7 +1919,7 @@ PKM.login(&#39;garfield&#39;, &#39;password&#39;, config).then((pkm) =&gt;
console.error(err);
process.exit(1);
});</code></pre>
<p>When the PKM is running server side and a user is willing to access the PKM through the Internet using a web browser, he needs to authenticate once and then can request for PKM services many times before session expires. To identify an authenticated user while maintaining security, PKM relies on a key (generated by the PKM server), which validity lasts for a certain amount of time.</p>
<p>When the PKM is running server side and a user wants to access the PKM through the Internet using a web browser, they need to authenticate once and can then request PKM services many times before the session expires. To identify an authenticated user while maintaining security, PKM relies on a key (generated by the PKM server), whose validity lasts for a certain amount of time.</p>
<p>Server side code for user’s authentication looks like below:</p>
<pre><code>const PKM = require(&#39;./core/pkm&#39;);
const config = {
......@@ -2063,7 +2063,7 @@ PKM.logout(key).then((pkm) =&gt;
<p>The <em>PKM REST API</em> provides access to the PKM over HTTP/HTTPS. The PKM has an <a href="https://www.openapis.org">OpenAPI</a> 3 specification available at <a href="https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/api/pkm-openapi.yaml">https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/api/pkm-openapi.yaml</a>, which makes it possible to automatically generate an SDK for many programming languages (see Appendix A.2). Appendix A.2 contains detailed explanations about the implementation design of the REST server, which provides front-ends and third-party tool developers with the PKM REST API.</p>
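<p>For example (illustrative only; the target generator and output directory are arbitrary choices), a client SDK can be generated from this specification with <a href="https://openapi-generator.tech">OpenAPI Generator</a>:</p>
<pre><code>$ openapi-generator-cli generate \
    -i api/pkm-openapi.yaml \
    -g typescript-fetch \
    -o pkm-client</code></pre>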
</div>
<div id="footer">
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/14/21</span>
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/16/21</span>
</div>
</body>
</html>
......@@ -15,7 +15,7 @@ Figure 3 below shows the layered software architecture of the PKM server and its
The bottom layer encompasses the underlying Node.js Javascript run-time, the MongoDB Node.js driver, the MongoDB database server, the local file system, and Git.
On top of this, the *PKM util* layer is a kind of hardware abstraction layer that abstracts file system I/O and simplifies the process of tools execution.
The *PKM core* provides Class `PKM`, which implements the `PKM` and is responsible of database accesses using the *MongoDB Node.js driver*.
The *PKM core* provides Class `PKM`, which implements the `PKM` and is responsible for database accesses using the *MongoDB Node.js driver*.
*PKM util* and *PKM core* constitute a Javascript SDK for the PKM whose purpose is mostly to allow third party contributors to extend the PKM.
On the highest level, there are two kinds of front-ends:
......@@ -51,10 +51,10 @@ The `Project` collection hosts the project metadata (name, members).
Each project has a collection for Git working trees metadata, which the PKM server updates after running Git commands.
Some collections track the tool executions (invocations and logs).
The tool specifications enable GUI front-ends to create well-formed tool invocations (for the Process Engine).
The methodology have a collection for the status of each phase of the methodology.
The methodology has a collection for the status of each phase of the methodology.
The programming artefacts have collections for files (both executable binaries and source code), for compile commands (with compilation flags and options), and (after parsing) for source code Abstract Syntax Trees (ASTs), comments, and annotations.
The models also have collections for the UML files and (after parsing) for UML class diagrams and state machines.
The documentation have a collection for the documentation files (e.g. .docx files), for the Abstract Semi-Formal Models (ASFMs), which Doc to ASFM can generate from .docx files, and for the Graphical documentation written in the Graphical Specification Language (GSL).
The documentation has collections for the documentation files (e.g. .docx files), for the Abstract Semi-Formal Models (ASFMs), which Doc to ASFM can generate from .docx files, and for the Graphical documentation written in the Graphical Specification Language (GSL).
The links discovering tool (Semantic parsing) populates the 2D traceability matrix, while Semantic Role Labeling (SRL) and Named Entity Recognition (NER) tools extract information and synthesize Annotations in a dedicated collection.
The TESTAR tool (automated GUI testing) has specialized collections for settings, models, and results.
Finally, there are collections for Common Vulnerabilities and Exposures, and the reviews.
......@@ -1105,7 +1105,7 @@ The main structure of the schema is the following:
### 3.2.6 Compile commands <a id="2.6"></a>
The collection for compile commands, which contains the compiler command options and flags for each compilation unit, is `'CompileCommands'`.
The PKM compile commands derives from [LLVM compile commands](https://clang.llvm.org/docs/JSONCompilationDatabase.html) with one exception: the `directory` and `file` properties are path relative to the virtual root directory of the PKM project.
The PKM compile commands derive from [LLVM compile commands](https://clang.llvm.org/docs/JSONCompilationDatabase.html) with one exception: the `directory` and `file` properties are paths relative to the virtual root directory of the PKM project.
The complete schema for compile commands can be downloaded from the [pkm-api gitlab repository](https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/api/pkm-compile-command-schema.json).
The schema is the following:
......@@ -1975,7 +1975,7 @@ Below is an example for login:
});
When the PKM is running server side and a user wants to access the PKM through the Internet using a web browser, they need to authenticate once and can then request PKM services many times before the session expires.
To identify an authenticated user while maintaining security, PKM relies on a key (generated by the PKM server), which validity lasts for a certain amount of time.
To identify an authenticated user while maintaining security, PKM relies on a key (generated by the PKM server), whose validity lasts for a certain amount of time.
Server side code for user's authentication looks like below:
......
......@@ -2,7 +2,7 @@
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="date" content='12/14/21'>
<meta name="date" content='12/16/21'>
<title>D1.5 Open source PKM server-side software</title>
<link rel="stylesheet" href='../style/style.css' type="text/css">
</head>
......@@ -97,7 +97,7 @@
<li>then delete files in the database which are no longer in the Git working directory (*)</li>
</ul></li>
</ol>
<p>Steps 1 to 4 retrieve the user’s credentials and prepare the disk storage for the execution of the Git commands. These steps are essentially about dumping the dirty files from the PKM on the disk, and possibly deleting the files that are no longer in the PKM. Step 5 is about running the Git commands, which effect is making some directories and files on the disk to appear or disappear. Steps 6 to 11 update the PKM which is essentially filling the PKM with the on-disk files, and deleting files from the PKM which are no longer on the disk.</p>
<p>Steps 1 to 4 retrieve the user’s credentials and prepare the disk storage for the execution of the Git commands. These steps are essentially about dumping the dirty files from the PKM on the disk, and possibly deleting the files that are no longer in the PKM. Step 5 is about running the Git commands, whose effect is to make some directories and files on the disk appear or disappear. Steps 6 to 11 update the PKM, which essentially means filling the PKM with the on-disk files and deleting from the PKM the files that are no longer on the disk.</p>
<p>The actual implementation of this algorithm has some tuning options (<code>dont_delete_pkm_files</code> and <code>dont_delete_git_working_tree_files</code>), to support a lazy synchronization between the PKM (the database) and the Git working trees, to avoid deleting files on one side which have disappeared on the other side, see (*) in the algorithm.</p>
<p><strong>Poll a Git job</strong></p>
<pre><code>GET /git/job/{jobId}</code></pre>
......@@ -136,7 +136,7 @@ DELETE /git/working_trees/{dbName}/{gitWorkingTree}?dontDeletePkmFiles=…</code
<p>These operations allow deleting Git working trees created with <code>'clone'</code>. These operations can also delete the corresponding files in the PKM. There is no other way to delete Git working trees, and the only way to create a new Git working tree is to run a <code>clone</code> command.</p>
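<p>For example (illustrative only; the host, port, database name, and working tree name are placeholders, and the session key required for authentication is omitted), such a request could look like:</p>
<pre><code>$ curl -X DELETE &quot;https://&lt;pkm-host&gt;:&lt;port&gt;/git/working_trees/MyProject/my_working_tree?dontDeletePkmFiles=true&quot;</code></pre>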
</div>
<div id="footer">
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/14/21</span>
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/16/21</span>
</div>
</body>
</html>
......@@ -140,7 +140,7 @@ An overview of the algorithm for the `run` operation is the following:
Steps 1 to 4 retrieve the user's credentials and prepare the disk storage for the execution of the Git commands.
These steps are essentially about dumping the dirty files from the PKM on the disk, and possibly deleting the files that are no longer in the PKM.
Step 5 is about running the Git commands, which effect is making some directories and files on the disk to appear or disappear.
Step 5 is about running the Git commands, whose effect is to make some directories and files on the disk appear or disappear.
Steps 6 to 11 update the PKM, which essentially means filling the PKM with the on-disk files and deleting from the PKM the files that are no longer on the disk.
The actual implementation of this algorithm has some tuning options (`dont_delete_pkm_files` and `dont_delete_git_working_tree_files`), to support a lazy synchronization between the PKM (the database) and the Git working trees, to avoid deleting files on one side which have disappeared on the other side, see (*) in the algorithm.
......
......@@ -2,7 +2,7 @@
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="date" content='12/14/21'>
<meta name="date" content='12/16/21'>
<title>D1.5 Open source PKM server-side software</title>
<link rel="stylesheet" href='../style/style.css' type="text/css">
</head>
......@@ -55,7 +55,7 @@
<li>UNISIM Excavator for PKM</li>
</ul>
<h2 id="a.3-integration-non-regression-testing">A.3 Integration &amp; non-regression testing <a id="3"></a></h2>
<p>There is a set of benchmarks for the DECODER project. These benchmarks are tests mostly intended for integration testing and non-regression testing of the DECODER Project tool-chain. There are not only basic tests but also more complex tests with software from the real world, such as OpenCV and Linux. They have been used extensively to ensure that the server is performing its missions correctly and scaling up in realistic scenarios.</p>
<p>There is a set of benchmarks for the DECODER project. These benchmarks are tests mostly intended for integration testing and non-regression testing of the DECODER Project tool-chain. There are not only basic tests but also more complex tests with software from the real world and DECODER test cases, such as OpenCV and a Linux driver. They have been used extensively to ensure that the server is performing its missions correctly and scaling up in realistic scenarios.</p>
<p>These benchmarks are available at <a href="https://gitlab.ow2.org/decoder/integration-tests">https://gitlab.ow2.org/decoder/integration-tests</a>.</p>
<p>Each test has a bash script (<code>run.sh</code>) to run the test and a dataset (in directory <code>data</code>).</p>
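<p>For example (the test directory name below is a placeholder, and the PKM services are assumed to be already running), a test is typically run as follows:</p>
<pre><code>$ git clone https://gitlab.ow2.org/decoder/integration-tests
$ cd integration-tests/&lt;some-test&gt;
$ ./run.sh</code></pre>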
<p>The currently available tests are:</p>
......@@ -73,7 +73,7 @@
<p>The tests use the popular <a href="https://curl.se/">cURL</a> program to communicate with the PKM and tools through their REST APIs.</p>
</div>
<div id="footer">
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/14/21</span>
<span><a href="../index.html">D1.5 Open source PKM server-side software</a> - 12/16/21</span>
</div>
</body>
</html>
......@@ -54,7 +54,7 @@ The services that currently use OpenAPI Generator are the followings:
There is a set of benchmarks for the DECODER project.
These benchmarks are tests mostly intended for integration testing and non-regression testing of the DECODER Project tool-chain.
There are not only basic tests but also more complex tests with software from the real world, such as OpenCV and Linux.
There are not only basic tests but also more complex tests with software from the real world and DECODER test cases, such as OpenCV and a Linux driver.
They have been used extensively to ensure that the server is performing its missions correctly and scaling up in realistic scenarios.
These benchmarks are available at [https://gitlab.ow2.org/decoder/integration-tests](https://gitlab.ow2.org/decoder/integration-tests).
......
......@@ -2,7 +2,7 @@
<html lang="en">
<head>
<meta charset="utf-8">
<meta name="date" content='12/14/21'>
<meta name="date" content='12/16/21'>
<title>D1.5 Open source PKM server-side software</title>
<link rel="stylesheet" href='style/style.css' type="text/css">
</head>
......@@ -34,7 +34,7 @@
<li><a href="2/index.html#2">Easy installation (no build required)</a></li>
<li><a href="2/index.html#3">Minimal Docker installation</a>
<ol type="1">
<li><a href="2/index.html#3.1">Setting up administrators credentials (optional)</a></li>
<li><a href="2/index.html#3.1">Setting up administrators credentials (optional)</a></li>
<li><a href="2/index.html#3.2">Building the docker images</a></li>
<li><a href="2/index.html#3.3">Starting the docker services</a></li>
</ol></li>
......@@ -133,7 +133,7 @@
</ul>
</div>
<div id="footer">
<span><a href="index.html">D1.5 Open source PKM server-side software</a> - 12/14/21</span>
<span><a href="index.html">D1.5 Open source PKM server-side software</a> - 12/16/21</span>
</div>
</body>
</html>
......@@ -11,7 +11,7 @@
1. [Configuration (optional)](2/index.md#1)
2. [Easy installation (no build required)](2/index.md#2)
3. [Minimal Docker installation](2/index.md#3)
1. [Setting up administrators credentials (optional)](2/index.md#3.1)
1. [Setting up administrators' credentials (optional)](2/index.md#3.1)
2. [Building the docker images](2/index.md#3.2)
3. [Starting the docker services](2/index.md#3.3)
3. [PKM server architecture](3/index.md)
......
......@@ -6,7 +6,7 @@ The PKM server has a configuration file, named `pkm_config.json`.
The PKM server comes with a default configuration file available on the [pkm-api gitlab repository](https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/pkm_config.json).
The default configuration file, which content is outlined below, is good for most people:
The default configuration file, whose content is outlined below, is good for most people:
%deps "${BIN}/json_to_md" "${PKM_API}/pkm_config.json"
%"${BIN}/json_to_md" --verbatim --max-depth 0 "${PKM_API}/pkm_config.json"
......@@ -103,9 +103,9 @@ As soon as the graphical user interface has started in your default web browser,
The installation procedure in this section applies only to PKM and some parsers for which the [pkm-api gitlab repository](https://gitlab.ow2.org/decoder/pkm-api) hosts the source code.
This installation procedure is **not** for all the docker images of the DECODER EU Project Tool-chain.
For a complete installation, refers to Section 2.2.
For a complete installation, refer to Section 2.2.
The built and installed services are the followings:
The built and installed services are the following:
* PKM
* Frama-C for PKM
......@@ -115,12 +115,12 @@ The built and installed services are the followings:
This minimal set of services handles the PKM (document management and querying) and parsing C, C++, .docx, and executable binary files.
### 2.3.1 Setting up administrators credentials (optional)<a id="3.1"></a>
### 2.3.1 Setting up administrators' credentials (optional)<a id="3.1"></a>
MongoDB and PKM require an initialization.
When PKM and MongoDB services start for the time, the initialization starts automatically.
When PKM and MongoDB services start for the first time, the initialization starts automatically.
Once initialized, the MongoDB and PKM services gracefully ignore later attempts to initialize.
The stock MongoDB database is virgin with security disabled, so the initialization of the MongoDB service consists in creating an administrator (a "superuser") in database admin (admin@admin), which can create users and roles in any MongoDB database, then enabling MongoDB security.
The stock MongoDB database is virgin with security disabled, so the initialization of the MongoDB service consists in creating an administrator (a "superuser") in database admin (admin@admin), who can create users and roles in any MongoDB database, then enabling MongoDB security.
The initialization of the PKM service consists in creating the PKM management database (where all the PKM users live and where the list of PKM projects is) and then an initial PKM administrator.
That PKM administrator can later create other users and even other PKM administrators in the PKM management database.
The PKM initialization runs on behalf of the MongoDB "superuser".
......
......@@ -15,7 +15,7 @@ Figure 3 below shows the layered software architecture of the PKM server and its
The bottom layer encompasses the underlying Node.js Javascript run-time, the MongoDB Node.js driver, the MongoDB database server, the local file system, and Git.
On top of this, the *PKM util* layer is a kind of hardware abstraction layer that abstracts file system I/O and simplifies the process of tools execution.
The *PKM core* provides Class `PKM`, which implements the `PKM` and is responsible of database accesses using the *MongoDB Node.js driver*.
The *PKM core* provides Class `PKM`, which implements the `PKM` and is responsible for database accesses using the *MongoDB Node.js driver*.
*PKM util* and *PKM core* constitute a Javascript SDK for the PKM whose purpose is mostly to allow third party contributors to extend the PKM.
On the highest level, there are two kinds of front-ends:
......@@ -51,10 +51,10 @@ The `Project` collection hosts the project metadata (name, members).
Each project has a collection for Git working trees metadata, which the PKM server updates after running Git commands.
Some collections track the tool executions (invocations and logs).
The tool specifications enable GUI front-ends to create well-formed tool invocations (for the Process Engine).
The methodology have a collection for the status of each phase of the methodology.
The methodology has a collection for the status of each phase of the methodology.
The programming artefacts have collections for files (both executable binaries and source code), for compile commands (with compilation flags and options), and (after parsing) for source code Abstract Syntax Trees (ASTs), comments, and annotations.
The models also have collections for the UML files and (after parsing) for UML class diagrams and state machines.
The documentation have a collection for the documentation files (e.g. .docx files), for the Abstract Semi-Formal Models (ASFMs), which Doc to ASFM can generate from .docx files, and for the Graphical documentation written in the Graphical Specification Language (GSL).
The documentation has collections for the documentation files (e.g. .docx files), for the Abstract Semi-Formal Models (ASFMs), which Doc to ASFM can generate from .docx files, and for the Graphical documentation written in the Graphical Specification Language (GSL).
The links discovering tool (Semantic parsing) populates the 2D traceability matrix, while Semantic Role Labeling (SRL) and Named Entity Recognition (NER) tools extract information and synthesize Annotations in a dedicated collection.
The TESTAR tool (automated GUI testing) has specialized collections for settings, models, and results.
Finally, there are collections for Common Vulnerabilities and Exposures, and the reviews.
......@@ -267,7 +267,7 @@ The main structure of the schema is the following:
### 3.2.6 Compile commands <a id="2.6"></a>
The collection for compile commands, which contains the compiler command options and flags for each compilation unit, is `'CompileCommands'`.
The PKM compile commands derives from [LLVM compile commands](https://clang.llvm.org/docs/JSONCompilationDatabase.html) with one exception: the `directory` and `file` properties are path relative to the virtual root directory of the PKM project.
The PKM compile commands derive from [LLVM compile commands](https://clang.llvm.org/docs/JSONCompilationDatabase.html) with one exception: the `directory` and `file` properties are paths relative to the virtual root directory of the PKM project.
The complete schema for compile commands can be downloaded from the [pkm-api gitlab repository](https://gitlab.ow2.org/decoder/pkm-api/-/blob/master/api/pkm-compile-command-schema.json).
The schema is the following:
......@@ -478,7 +478,7 @@ Below is an example for login:
});
When the PKM is running server side and a user wants to access the PKM through the Internet using a web browser, they need to authenticate once and can then request PKM services many times before the session expires.
To identify an authenticated user while maintaining security, PKM relies on a key (generated by the PKM server), which validity lasts for a certain amount of time.
To identify an authenticated user while maintaining security, PKM relies on a key (generated by the PKM server), whose validity lasts for a certain amount of time.
Server side code for user's authentication looks like below:
......
......@@ -140,7 +140,7 @@ An overview of the algorithm for the `run` operation is the following:
Steps 1 to 4 retrieve the user's credentials and prepare the disk storage for the execution of the Git commands.
These steps are essentially about dumping the dirty files from the PKM on the disk, and possibly deleting the files that are no longer in the PKM.
Step 5 is about running the Git commands, which effect is making some directories and files on the disk to appear or disappear.
Step 5 is about running the Git commands, whose effect is to make some directories and files on the disk appear or disappear.
Steps 6 to 11 update the PKM, which essentially means filling the PKM with the on-disk files and deleting from the PKM the files that are no longer on the disk.
The actual implementation of this algorithm has some tuning options (`dont_delete_pkm_files` and `dont_delete_git_working_tree_files`), to support a lazy synchronization between the PKM (the database) and the Git working trees, to avoid deleting files on one side which have disappeared on the other side, see (*) in the algorithm.
......
......@@ -54,7 +54,7 @@ The services that currently use OpenAPI Generator are the followings:
There is a set of benchmarks for the DECODER project.
These benchmarks are tests mostly intended for integration testing and non-regression testing of the DECODER Project tool-chain.
There are not only basic tests but also more complex tests with software from the real world, such as OpenCV and Linux.
There are not only basic tests but also more complex tests with software from the real world and DECODER test cases, such as OpenCV and a Linux driver.
They have been used extensively to ensure that the server is performing its missions correctly and scaling up in realistic scenarios.
These benchmarks are available at [https://gitlab.ow2.org/decoder/integration-tests](https://gitlab.ow2.org/decoder/integration-tests).
......
......@@ -11,7 +11,7 @@
1. [Configuration (optional)](2/index.md#1)
2. [Easy installation (no build required)](2/index.md#2)
3. [Minimal Docker installation](2/index.md#3)
1. [Setting up administrators credentials (optional)](2/index.md#3.1)
1. [Setting up administrators' credentials (optional)](2/index.md#3.1)
2. [Building the docker images](2/index.md#3.2)
3. [Starting the docker services](2/index.md#3.3)
3. [PKM server architecture](3/index.md)
......