Fix readme typos
Eddasol committed Jan 30, 2025
1 parent c087e2c commit d26acac
Showing 1 changed file (backend/README.md) with 25 additions and 25 deletions.

## Setup

To set up the backend on **Windows/Mac**, install Visual Studio and include the "ASP.NET and web development" workload during install.
If you already have Visual Studio installed, you can open the "Visual Studio Installer" and modify your install to add the workload.

To set up the backend on **Linux**, install .NET for Linux
[here](https://docs.microsoft.com/en-us/dotnet/core/install/linux).
You also need to install the dev certificate for local .NET development on Linux.
Follow
[this guide](https://learn.microsoft.com/en-us/aspnet/core/security/enforcing-ssl?view=aspnetcore-7.0&tabs=visual-studio%2Clinux-ubuntu#trust-https-certificate-on-linux)
for each browser you wish to trust it in.
**NB:** You will probably need to run the commands with `sudo` to have permission to modify the certificate stores.

For the configuration to be able to read secrets from the keyvault, you will need to have the client secret stored locally in your secret manager.

For the MQTT client to function, the application expects a config variable in the MQTT section called `Password`, containing the password for the MQTT broker.
This must either be stored in a connected keyvault as "Mqtt--Password" or in the ASP.NET secret manager
as described in the [configuration section](#Configuration).

Each MQTT message has its own class representation, and is linked to its respective topic pattern in [MqttTopics.cs](api/MQTT/MqttTopics.cs).
To match incoming topic messages against the topic patterns we use helper functions to convert from the
[MQTT wildcards](https://docs.oasis-open.org/mqtt/mqtt/v5.0/os/mqtt-v5.0-os.html#_Toc3901242)
to regex wildcards for the dictionary lookup.
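As an illustration of this conversion (a hypothetical sketch, not the actual helper in [MqttTopics.cs](api/MQTT/MqttTopics.cs)), the wildcard translation could look like:

```csharp
using System.Text.RegularExpressions;

public static class TopicPatternSketch
{
    // Hypothetical sketch: translate an MQTT topic pattern into a regex.
    // '+' matches exactly one topic level, '#' matches all remaining levels.
    public static Regex ToRegex(string topicPattern)
    {
        string pattern = Regex.Escape(topicPattern)
            .Replace(@"\+", "[^/]+") // single-level wildcard
            .Replace(@"\#", ".*");   // multi-level wildcard
        return new Regex($"^{pattern}$");
    }
}
```

With this, a pattern like `isar/+/robot_status` would match `isar/robot_1/robot_status` but not `isar/a/b/robot_status`.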

Each topic then has its respective [event](https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/events/)
which is triggered whenever a new message arrives in that topic.
The list of topics being subscribed to is defined as an array in
[appsettings.Development.json](api/appsettings.Development.json).
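As a hypothetical illustration (the topic names below are made up; the real list lives in the file above), such an array could look like:

```
"Mqtt": {
  "Topics": [
    "isar/+/robot_status",
    "isar/+/mission"
  ]
}
```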

An example of the subscriber pattern for an MQTT event is implemented in
[MqttEventHandler.cs](api/EventHandlers/MqttEventHandler.cs).

## Configuration

The project has two [appsettings](https://docs.microsoft.com/en-us/iis-administration/configuration/appsettings.json)
files.
The base `appsettings.json` file is for common variables across all environments, while the
`appsettings.Development.json` file is for variables specific to the Dev environments, such as the client IDs for the
various app registrations used in development.

The configuration will also read from a configured Azure keyvault, which can then be accessed the same way as any other config variables.
For this to work you will need to have the client secret stored locally in the secret manager as described below.
The client secret (and MQTT password if not connected to keyvault) should be in the following format:

```
"AzureAd": {
  ...
}
```

```bash
dotnet tool install --global dotnet-ef
```
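As a sketch of how these could be stored with the ASP.NET secret manager (the values below are placeholders; note that the keyvault name `Mqtt--Password` maps to the configuration key `Mqtt:Password`):

```bash
dotnet user-secrets set "AzureAd:ClientSecret" "<your-client-secret>"
dotnet user-secrets set "Mqtt:Password" "<your-mqtt-broker-password>"
```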

### Adding a new migration

**NB: Make sure you have fetched the newest code from main and that no one else
is making migrations at the same time as you!**

1. Set the environment variable `ASPNETCORE_ENVIRONMENT` to `Development`:
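   On Linux/macOS, this is typically done with:

   ```bash
   export ASPNETCORE_ENVIRONMENT=Development
   ```

   (On Windows PowerShell, the equivalent is `$env:ASPNETCORE_ENVIRONMENT = "Development"`.)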
- The `your-migration-name-here` is basically a database commit message.
- `Database__ConnectionString` will be fetched from the keyvault when running the `add` command.
- `add` will _not_ update or alter the connected database in any way, but will add a
description of the changes that will be applied later.
- If you for some reason are unhappy with your migration, you can delete it with:
```bash
dotnet ef migrations remove
```

### Applying the migrations to the dev database

Updates to the database structure (applying migrations) are done in GitHub Actions.

When a pull request contains changes in the `backend/api/Database/Migrations` folder,
[a workflow](https://github.com/equinor/flotilla/blob/main/.github/workflows/notifyMigrationChanges.yml)
After the pull request is approved, a user can then trigger the database changes.

This will trigger
[another workflow](https://github.com/equinor/flotilla/blob/main/.github/workflows/updateDatabase.yml)
which updates the database by applying the new migrations.

By doing migrations this way, we ensure that the commands themselves are scripted, and that the database
changes become part of the review process of a pull request.

## Database setup

If you are resetting the database but still using PostgreSQL, remove the old migrations and add them again manually.

## Database backup and cloning

You can use `pg_dump` to extract a PostgreSQL database into an SQL file and `psql` to import the data from that file into the target database. Make sure the database server is running (for example, check in pgAdmin), then execute the following commands.

Extract the entire database:

```
pg_dump -U Username -d postgres -h host_name_or_address -p port -f output_file_name.sql
```

Extract specific tables:

```
pg_dump -U Username -d postgres -h host_name_or_address -p port -t '"table_name"' -t '"second_table_name"' -f output_file_name.sql
```

Import the dump file into the target database:

```
psql -U Username -d postgres -h host_name_or_address -p port -f output_file_name.sql
```

## Formatting

The formatting of the backend is defined in the [.editorconfig file](../.editorconfig).

In everyday development we use [CSharpier](https://csharpier.com/) to auto-format code on save. Installation procedure is described [here](https://csharpier.com/docs/About). No configuration should be required. To run CSharpier locally, go to the backend folder and run:
`dotnet csharpier . --check`

## SignalR
events and not receiving them, and all transmissions are sent using the SignalRS
doing so it is important to make sure that the event name provided corresponds with the name expected
in the frontend.

It is also crucial that we do not await sending SignalR messages in our code. Instead, we ignore the
await warning. In the current version of the SignalR library, sending a message in an
asynchronous thread may cause the thread to silently exit without returning an exception, which is
avoided by letting the SignalR code run asynchronously after the current thread has executed.
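As a minimal sketch of this fire-and-forget pattern (the event name and send method below are stand-ins, not the backend's actual SignalR calls):

```csharp
using System;
using System.Threading.Tasks;

public static class FireAndForgetSketch
{
    // Stand-in for something like hubContext.Clients.All.SendAsync(eventName, payload)
    public static async Task SendAsync(string eventName)
    {
        await Task.Delay(10); // simulate network latency
        Console.WriteLine($"sent {eventName}");
    }

    public static void Main()
    {
        // The discard (_) intentionally skips awaiting the task and suppresses
        // the compiler's "call is not awaited" warning; the current thread
        // continues while the send completes in the background.
        _ = SendAsync("missionRunCreated");
        Console.WriteLine("current thread continues immediately");
        Task.Delay(100).Wait(); // keep the demo alive until the send finishes
    }
}
```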
The connection strings for the AI instances are stored in the keyvault.

## Custom Mission Loaders

You can create your own mission loader to fetch missions from some external system. The custom mission loader needs to fulfill the [IMissionLoader](api/Services/MissionLoaders/MissionLoaderInterface.cs) interface. If your mission loader is an external API you might need to add it as a downstream API in [Program.cs](api/Program.cs).

## Authorization

We use role-based access control (RBAC) for authorization.

The access matrix looks like this:

