From d26acacc70b2a9a6cddd3d411342e9757c683379 Mon Sep 17 00:00:00 2001 From: Eddasol Date: Thu, 30 Jan 2025 13:58:25 +0100 Subject: [PATCH] Fix readme typos --- backend/README.md | 50 +++++++++++++++++++++++------------------------ 1 file changed, 25 insertions(+), 25 deletions(-) diff --git a/backend/README.md b/backend/README.md index 50946c01..d254f95d 100644 --- a/backend/README.md +++ b/backend/README.md @@ -28,12 +28,12 @@ Useful documentation of concepts and features in the .NET frameworks can be foun ## Setup -To set up the backend on **Windows/Mac**, install visual studio and include the "ASP.NET and web development" workload during install. -If you already have visual studio installed, you can open the "Visual Studio Installer" and modify your install to add the workload. +To set up the backend on **Windows/Mac**, install Visual Studio and include the "ASP.NET and web development" workload during install. +If you already have Visual Studio installed, you can open the "Visual Studio Installer" and modify your install to add the workload. -To set up the backend on **Linux**, install .NET for linux +To set up the backend on **Linux**, install .NET for Linux [here](https://docs.microsoft.com/en-us/dotnet/core/install/linux). -You need to also install the dev certificate for local .NET development on linux. +You need to also install the dev certificate for local .NET development on Linux. Follow [this guide](https://learn.microsoft.com/en-us/aspnet/core/security/enforcing-ssl?view=aspnetcore-7.0&tabs=visual-studio%2Clinux-ubuntu#trust-https-certificate-on-linux), for each of the browser(s) you wish to trust it in. @@ -41,7 +41,7 @@ for each of the browser(s) you wish to trust it in. For the configuration to be able to read secrets from the keyvault, you will need to have the client secret stored locally in your secret manager. -For the MQTT client to function, the application expects a config variable in the MQTT section called `Password`, containing the password for the mqtt broker. +For the MQTT client to function, the application expects a config variable in the MQTT section called `Password`, containing the password for the MQTT broker. This must either be stored in a connected keyvault as "Mqtt--Password" or in the ASP.NET secret manager as described in the [configuration section](#Configuration). @@ -97,27 +97,27 @@ and runs as an ASP.NET Each MQTT message has its own class representation, and is linked to its respective topic pattern in [MqttTopics.cs](api/MQTT/MqttTopics.cs). To match incoming topic messages against the topic patterns we use helper functions to convert from the [MQTT wildcards](https://docs.oasis-open.org/mqtt/mqtt/v5.0/os/mqtt-v5.0-os.html#_Toc3901242) -to regEx wildcards for the dictionnary lookup. +to regEx wildcards for the dictionary lookup. -Each topic then has it's respective [event](https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/events/) +Each topic then has its respective [event](https://docs.microsoft.com/en-us/dotnet/csharp/programming-guide/events/) which is triggered whenever a new message arrives in that topic. -The list of topics being subscribe to is defined as an array in +The list of topics being subscribed to is defined as an array in [appsettings.Development.json](api/appsettings.Development.json). An example of the subscriber pattern for an MQTT event is implemented in -[MqttEvenHandler.cs](api/EventHandlers/MqttEventHandler.cs). +[MqttEventHandler.cs](api/EventHandlers/MqttEventHandler.cs). 
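For local development without a connected keyvault, the MQTT password and the keyvault client secret mentioned above can be stored with the ASP.NET secret manager. Below is a minimal sketch of the commands, assuming they are run from the API project folder; the `AzureAd:ClientSecret` key name is an assumption and should be checked against the configuration format shown in the next section:

```bash
# Enable user secrets for the project (only needed if no UserSecretsId exists yet)
dotnet user-secrets init

# Store the MQTT broker password (the keyvault equivalent is "Mqtt--Password")
dotnet user-secrets set "Mqtt:Password" "<mqtt-broker-password>"

# Store the client secret used to access the keyvault (key name assumed)
dotnet user-secrets set "AzureAd:ClientSecret" "<client-secret>"
```

The secret manager keeps these values outside the repository, so they are never committed together with the appsettings files.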
## Configuration The project has two [appsettings](https://docs.microsoft.com/en-us/iis-administration/configuration/appsettings.json) files. The base `appsettings.json` file is for common variables across all environments, while the -`appsettings.Development.json` file is for variables specific to the Dev environments, such as the client ID's for the +`appsettings.Development.json` file is for variables specific to the Dev environments, such as the client IDs for the various app registrations used in development. -The configuration will also read from a configured azure keyvault, which can then be accessed the same way as any other config variables. +The configuration will also read from a configured Azure keyvault, which can then be accessed the same way as any other config variables. For this to work you will need to have the client secret stored locally in the secret manager as described below. -The client secret (and mqtt password if not connected to keyvault) should be in the following format: +The client secret (and MQTT password if not connected to keyvault) should be in the following format: ``` "AzureAd": { @@ -149,7 +149,7 @@ dotnet tool install --global dotnet-ef ### Adding a new migration -**NB: Make sure you have have fetched the newest code from main and that no-one else +**NB: Make sure you have fetched the newest code from main and that no one else is making migrations at the same time as you!** 1. Set the environment variable `ASPNETCORE_ENVIRONMENT` to `Development`: @@ -170,7 +170,7 @@ is making migrations at the same time as you!** - The `your-migration-name-here` is basically a database commit message. - `Database__ConnectionString` will be fetched from the keyvault when running the `add` command. - `add` will _not_ update or alter the connected database in any way, but will add a - description of the changes that will be applied later + description of the changes that will be applied later. - If you for some reason are unhappy with your migration, you can delete it with: ```bash dotnet ef migrations remove @@ -180,7 +180,7 @@ is making migrations at the same time as you!** ### Applying the migrations to the dev database -Updates to the database structure (applying migrations) are done in Github Actions. +Updates to the database structure (applying migrations) are done in GitHub Actions. When a pull request contains changes in the `backend/api/Database/Migrations` folder, [a workflow](https://github.com/equinor/flotilla/blob/main/.github/workflows/notifyMigrationChanges.yml) @@ -191,7 +191,7 @@ After the pull request is approved, a user can then trigger the database changes This will trigger [another workflow](https://github.com/equinor/flotilla/blob/main/.github/workflows/updateDatabase.yml) -which updates the database by apploying the new migrations. +which updates the database by applying the new migrations. By doing migrations this way, we ensure that the commands themselves are scripted, and that the database changes become part of the review process of a pull request. @@ -204,28 +204,28 @@ and [promoteToStaging](https://github.com/equinor/flotilla/blob/main/.github/wor ## Database setup -If resetting database, but still using postgresql (removing old migrations and adding them manually again). +If resetting database, but still using PostgreSQL (removing old migrations and adding them manually again). 
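As a sketch of the reset described above, one possible sequence is to delete the generated migration files and regenerate the current schema as a single new migration. The migration name is only an example, and the commands assume you are in the `backend/api` folder with `ASPNETCORE_ENVIRONMENT` and the connection string set up as described in the migrations section:

```bash
# Remove the previously generated migration files (this does not touch the database itself)
rm -r Database/Migrations

# Recreate the current schema as one initial migration
dotnet ef migrations add InitialCreate
```

As with any other migration, the result should then be applied to the shared databases through the GitHub Actions workflows rather than by running `dotnet ef database update` against them locally.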
## Database backup and cloning -You can use pg_dump to extract a PostgreSQL database into an sql file and psql to import the data into the target database from that file. Have the server running on pgAdmin and then execute the following commands. +You can use pg_dump to extract a PostgreSQL database into an SQL file and psql to import the data into the target database from that file. Have the server running on pgAdmin and then execute the following commands. Extract the entire database: ``` -pg_dump -U Username -d postgres -h host_name_or_adress -p port -f ouput_file_name.slq +pg_dump -U Username -d postgres -h host_name_or_address -p port -f output_file_name.sql ``` Extract specific tables: ``` -pg_dump -U Username -d postgres -h host_name_or_adress -p port -t '"table_name"' -t '"second_table_name"' -f input_file_name.slq +pg_dump -U Username -d postgres -h host_name_or_address -p port -t '"table_name"' -t '"second_table_name"' -f input_file_name.sql ``` Upload file information to new database: ``` -psql U Username -d postgres -h host_name_or_adress -p port -f ouput_file_name.slq +psql -U Username -d postgres -h host_name_or_address -p port -f output_file_name.sql ``` ## Formatting @@ -234,7 +234,7 @@ psql U Username -d postgres -h host_name_or_adress -p port -f ouput_file_name.sl The formatting of the backend is defined in the [.editorconfig file](../.editorconfig). -In everyday development we use [CSharpier](https://csharpier.com/) to auto-format code on save. Installation procedure is described [here](https://csharpier.com/docs/About). No configuration should be required. To run csharpier locally, go to the backend folder and run: +In everyday development we use [CSharpier](https://csharpier.com/) to auto-format code on save. Installation procedure is described [here](https://csharpier.com/docs/About). No configuration should be required. To run CSharpier locally, go to the backend folder and run: `dotnet csharpier . --check` ## SignalR @@ -244,7 +244,7 @@ events and not receiving them, and all transmissions are sent using the SignalRS doing so it is important to make sure that the event name provided corresponds with the name expected in the frontend. -It is also crucial that we do not await sending signalR messages in our code. Instead we ignore the +It is also crucial that we do not await sending SignalR messages in our code. Instead we ignore the await warning. In the current version of the SignalR library, sending a message in an asynchronous thread may cause the thread to silently exit without returning an exception, which is avoided by letting the SignalR code run asynchronously after the current thread has executed. @@ -259,11 +259,11 @@ The connection strings for the AI instances are stored in the keyvault. ## Custom Mission Loaders -You can create your own mission loader to fetch missions from some external system. The custom mission loader needs to fulfill the [IMissionLoader](api/Services/MissionLoaders/MissionLoaderInterface.cs) interface. If you mission loader is an external API you might need to add it as a downstreamapi in [Program.cs](api/Program.cs) +You can create your own mission loader to fetch missions from some external system. The custom mission loader needs to fulfill the [IMissionLoader](api/Services/MissionLoaders/MissionLoaderInterface.cs) interface. If your mission loader is an external API you might need to add it as a downstream API in [Program.cs](api/Program.cs) ## Authorization -We use role based access control (RBAC) for authorization. 
+We use role-based access control (RBAC) for authorization. The access matrix looks like this: