Merge pull request #323 from amosproj/generalReadmes

Update general setup readmes

heskil authored Feb 4, 2025
2 parents ba10be9 + 36b3742 commit 020eeba
Showing 3 changed files with 114 additions and 17 deletions.
91 changes: 78 additions & 13 deletions Documentation/README.md
@@ -2,24 +2,89 @@

Software architecture description

## Basic setup:
## Requirements

```bash
npm ci
cd ./apps/analyzer/metadata_analyzer ; poetry install
```
- **Node 20 with npm**
- **Docker**
- **Docker Compose**
- **Python 3.11 and Poetry**

- `npm ci`: dependency install
### Build and run

- copy `.env.example` file in backend and rename to `.env` (adjust database properties according to database setup if necessary)
- copy `.env.example` file in analyzer and rename to `.env` (adjust port properties according to backend setup if necessary)
You now have two options:
1. Build and run with Docker
2. Build and run with Nx (for local development)

### Running the code locally:
#### Docker Build Setup Instructions

- `npm run be`: run backend individually
- `npm run fe`: run frontend individually
- `npm run py` : run python app
- `npm run all`: run backend, frontend and python module
1. **Clone the repository**:

```bash
git clone https://github.com/amosproj/amos2024ws02-backup-metadata-analyzer.git

```

2. **Change directory**:

```bash
cd ./amos2024ws02-backup-metadata-analyzer/

```

3. **Set up the .env files**:

In the project's root folder, copy **.env.docker.example** and rename the copy to **.env.docker**
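
   One way to do this from the project root (a minimal sketch, assuming a POSIX shell):

   ```bash
   # Copy the example file and rename the copy in one step
   cp .env.docker.example .env.docker
   ```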


4. **Copy database dump into project**:

Copy the database dump (`.dmp` file) into the project's root folder and rename it to **db_dump.sql**
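
   For example (the source path below is a placeholder for wherever your dump file lives):

   ```bash
   # Copy the dump into the project root under the expected name
   cp /path/to/backup_metadata.dmp ./db_dump.sql
   ```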

5. **Build Docker container**:

```bash
docker compose --env-file .env.docker build --no-cache

```

6. **Start Docker container**:

```bash
docker compose --env-file .env.docker up

```

7. **Stop Docker container**:
```bash
docker compose --env-file .env.docker down
```
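
If you prefer the containers to keep running in the background, Docker Compose's standard detached flag can be added (generic Compose behavior, not specific to this repository):

```bash
docker compose --env-file .env.docker up -d
```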


#### Local dev build and run instructions

- Use `npm ci` to install the local Node dependencies.
- Change into the `apps/analyzer/metadata_analyzer` directory and run `poetry install` to set up the Python virtual environment and its dependencies.

- In `apps/backend` and `apps/analyzer`: copy the `.env.example` files and rename them to `.env` (a consolidated command sketch follows this list).
- Make sure Postgres databases are running on the connections defined in the `.env` files:
  - The analyzer database should contain the backup metadata to be analyzed.
  - The backend database should initially be empty; it is used to store the analysis results.
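
Put together, the install and `.env` steps above might look like this when run from the repository root (a sketch only; paths follow the ones referenced elsewhere in this README):

```bash
# Install Node dependencies
npm ci

# Install the Python virtual environment and dependencies for the analyzer
cd ./apps/analyzer/metadata_analyzer
poetry install
cd ../../..

# Create the .env files from the provided examples
cp apps/backend/.env.example apps/backend/.env
cp apps/analyzer/.env.example apps/analyzer/.env
```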


(Suggestion) Use Docker to provide the database(s):
- If you only want to provide the analyzer database or the backend database via Docker, adjust the commands accordingly.
- Prepare the `.env.docker` file (see step 3 of the Docker setup instructions).
- `docker compose --env-file .env.docker build --no-cache backendDatabase analyzerDatabase`
- `docker compose --env-file .env.docker up backendDatabase analyzerDatabase`


Once you have the databases running:
- `npm run all` to run all modules at the same time

If you want to run the modules individually:
- `npm run py` to run the Python analyzer
- `npm run be` to run the TypeScript backend
- `npm run fe` to run the frontend

### Generating database migrations:

38 changes: 36 additions & 2 deletions README.md
@@ -4,11 +4,18 @@

Make sure the following are installed on your machine:

- **Node 20**
- **Node 20 with npm**
- **Docker**
- **Docker Compose**
- **Python 3.11 and Poetry**

## Docker Build Setup Instructions
## Build and run

You now have two options:
1. Build and run with Docker
2. Build and run with Nx (for local development)

### Docker Build Setup Instructions

1. **Clone the repository**:

@@ -51,3 +58,30 @@ Make sure the following are installed on your machine:
```bash
docker compose --env-file .env.docker down
```


### Local dev build and run instructions

- Use `npm ci` to install the local Node dependencies.
- Change into the `apps/analyzer/metadata_analyzer` directory and run `poetry install` to set up the Python virtual environment and its dependencies.

- In `apps/backend` and `apps/analyzer`: copy the `.env.example` files and rename them to `.env`.
- Make sure Postgres databases are running on the connections defined in the `.env` files (a purely illustrative sketch follows this list):
  - The analyzer database should contain the backup metadata to be analyzed.
  - The backend database should initially be empty; it is used to store the analysis results.
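
Purely as an illustration of what such a connection configuration can look like, a backend `.env` might contain entries along these lines; the variable names below are placeholders, the real keys are defined in the project's `.env.example` files:

```bash
# Placeholder names for illustration only -- use the keys from .env.example
DATABASE_HOST=localhost
DATABASE_PORT=5432
DATABASE_USER=postgres
DATABASE_PASSWORD=postgres
DATABASE_NAME=metadata_backend
```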


(Suggestion) Use Docker to provide the database(s):
- If you only want to provide the analyzer database or the backend database via Docker, adjust the commands accordingly.
- Prepare the `.env.docker` file (see step 3 of the Docker setup instructions).
- `docker compose --env-file .env.docker build --no-cache backendDatabase analyzerDatabase`
- `docker compose --env-file .env.docker up backendDatabase analyzerDatabase`


Once you have the databases running:
- `npm run all` to run all modules at the same time

If you want to run the modules individually:
- `npm run py` to run the Python analyzer
- `npm run be` to run the TypeScript backend
- `npm run fe` to run the frontend
2 changes: 0 additions & 2 deletions apps/backend/src/app/utils/pagination/paginationService.ts
@@ -244,8 +244,6 @@ export class PaginationService {
) {
whereConditions.push(`alias${i}.deprecated = FALSE`);
}
whereConditions.push(`alertType.user_active = TRUE`);
whereConditions.push(`alertType.master_active = TRUE`);

const whereClauseString =
whereConditions.length > 0 ? `${whereConditions.join(' AND ')}` : '';