
Commit

initial commit and update of project
Claire Hodge authored and Claire Hodge committed Jul 7, 2024
1 parent 646c177 commit d506580
Showing 8 changed files with 79 additions and 1 deletion.
Binary file added .DS_Store
30 changes: 30 additions & 0 deletions .github/workflows/deploy.yaml
@@ -0,0 +1,30 @@
name: Publish to GitHub Pages
on:
  push:
    branches:
      - main

jobs:
  deploy:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.x'

      - name: Install dependencies
        run: pip install mkdocs

      - name: Build MkDocs site
        run: mkdocs build --clean

      - name: Deploy to GitHub Pages
        uses: peaceiris/actions-gh-pages@v3
        with:
          github_token: ${{ secrets.MKDOCS_TOKEN }}
          publish_dir: ./site
1 change: 1 addition & 0 deletions .gitignore
@@ -0,0 +1 @@
site
16 changes: 15 additions & 1 deletion README.md
@@ -1 +1,15 @@
# event-driven-arch
# How to Run This Site

This site is built with MkDocs. Here are the steps to install and run it so you can view the documentation locally.

1. Make sure Python and pip are installed on your system.
2. Run `pip install mkdocs` in a terminal.
3. Run `mkdocs serve` in a terminal.
4. From the terminal output, copy the local address the site is being served on.
For example, it is usually something like:
```
Serving on http://127.0.0.1:8000/
```
5. Open that address in a browser; you should now be able to view the site locally.

To deploy the site to GitHub Pages, open a PR and merge it into `main`. A GitHub Actions workflow is set up to automatically build and deploy the site to GitHub Pages.
Binary file added docs/img/pub-sub-one-region.png
5 changes: 5 additions & 0 deletions docs/index.md
@@ -0,0 +1,5 @@
# Jetty System Design Interview
<br>
## Welcome!

This is a documentation site that fully explains the system I have modeled. Please feel free to navigate the site and check out all the documentation. If you have any further questions, feel free to email me at `mclairehodge@gmail.com` and check out the [GitHub repo](https://github.com/mclaireh/event-driven-arch).
21 changes: 21 additions & 0 deletions docs/pub-sub.md
@@ -0,0 +1,21 @@
# Pub-Sub Eventing System
<br><br>
## Pub-Sub Eventing Diagram
![pub-sub-system](img/pub-sub-one-region.png)

## Summary
This is an example of a pub-sub eventing system. The diagram covers only `us-east-1` (or any other single region) for simplicity. Third-party publishers push events to the `API Gateway`. These events have payloads that the `Ingestion Gateway Lambda` processes and writes to `DynamoDB`. The `Ingestion Gateway Lambda` also has an `SQS DLQ`: if events cannot make it to `DynamoDB` for any reason, they are sent to the `DLQ`, where they await reprocessing. A `CloudWatch Alarm` on the `DLQ` lets me know when events land there so I can reprocess them quickly, and it prompts me to investigate whether an underlying issue caused the failure. The `Ingestion Gateway Lambda` also pushes the processed payloads from the `API Gateway` to a `Macie S3 bucket`. `AWS Macie` scans this `S3` bucket for any sensitive information in the saved payloads. If sensitive information is present, Macie alerts and I speak with the publishers to stop it from happening again.
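As a rough illustration, a minimal ingestion handler could look like the sketch below. This is an assumption-laden sketch, not the actual implementation: the `EVENTS_TABLE` and `MACIE_BUCKET` environment variables, key names, and payload shape are all invented here.

```python
# Hypothetical sketch of the Ingestion Gateway Lambda (boto3 assumed).
# EVENTS_TABLE and MACIE_BUCKET are invented names, not the real configuration.
import json
import os
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

table = dynamodb.Table(os.environ["EVENTS_TABLE"])
macie_bucket = os.environ["MACIE_BUCKET"]


def handler(event, context):
    payload = json.loads(event["body"])
    event_id = payload.get("id", str(uuid.uuid4()))

    # Persist the event; per the design above, failed events are routed
    # to the SQS DLQ and trip the CloudWatch alarm for reprocessing.
    table.put_item(Item={"pk": event_id, "payload": json.dumps(payload)})

    # Copy the raw payload to the bucket that Macie scans for sensitive data.
    s3.put_object(
        Bucket=macie_bucket,
        Key=f"payloads/{event_id}.json",
        Body=json.dumps(payload).encode("utf-8"),
    )

    return {"statusCode": 202, "body": json.dumps({"id": event_id})}
```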

Events are saved in `DynamoDB`, and `DynamoDB Streams` records are created for each write. These streams invoke a `Publish SNS Lambda`, which pushes the events from `DynamoDB Streams` to a specific `SNS topic` depending on the payload. The `Publish SNS Lambda` also has an `SQS DLQ`: if a push to `SNS` fails, the events go to the `DLQ`. This `DLQ` has a `CloudWatch Alarm` set up so that any events on the `DLQ` alert me, again so that I can replay the events quickly and resolve any underlying issues before replaying them. There are multiple `SNS topics`, and a subscriber can subscribe to as many as they like. These `SNS topics` push the events to the subscriber, who can then act on them (for example, sending a text message to the customer).
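A minimal sketch of such a stream-triggered publisher is shown below, assuming a `TOPIC_ARNS` environment variable that maps an `event_type` field in the payload to an SNS topic ARN (both invented details for illustration, not the real implementation):

```python
# Hypothetical sketch of the Publish SNS Lambda triggered by DynamoDB Streams.
# TOPIC_ARNS maps an event type to a topic ARN; the names are invented here.
import json
import os

import boto3

sns = boto3.client("sns")

# e.g. {"order_created": "arn:aws:sns:us-east-1:123456789012:order-created"}
TOPIC_ARNS = json.loads(os.environ["TOPIC_ARNS"])


def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] != "INSERT":
            continue

        new_image = record["dynamodb"]["NewImage"]
        payload = json.loads(new_image["payload"]["S"])

        # Route the event to a topic based on its payload; raising here
        # lets the failed batch land on the DLQ for later replay.
        topic_arn = TOPIC_ARNS[payload["event_type"]]
        sns.publish(TopicArn=topic_arn, Message=json.dumps(payload))
```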

## Deep Dive

I would like to deep dive into a couple of the resources to give more detail on the design considerations.

### API Gateway

I authenticated the publishers using API keys. Each publisher had a single API key that was unique to them, which allowed me to revoke keys (and thus access) if needed. It also guaranteed that only those who had been granted API keys could publish to the system, helping to lock it down from potential abuse. I also assigned each publisher a usage plan, which allowed me to throttle a publisher if they began publishing something against company rules, such as sensitive customer info.
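As a sketch of what per-publisher onboarding could look like with boto3 (the function name, limits, and IDs below are placeholders, not the real configuration):

```python
# Hypothetical sketch of provisioning a per-publisher API key and usage plan.
import boto3

apigw = boto3.client("apigateway")


def onboard_publisher(name, rest_api_id, stage):
    # One unique API key per publisher, so access can be revoked individually.
    key = apigw.create_api_key(name=f"{name}-key", enabled=True)

    # A per-publisher usage plan makes it possible to throttle one publisher
    # without affecting the others.
    plan = apigw.create_usage_plan(
        name=f"{name}-plan",
        apiStages=[{"apiId": rest_api_id, "stage": stage}],
        throttle={"rateLimit": 50.0, "burstLimit": 100},
        quota={"limit": 100000, "period": "DAY"},
    )

    apigw.create_usage_plan_key(
        usagePlanId=plan["id"], keyId=key["id"], keyType="API_KEY"
    )
    return key["value"]
```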

### DynamoDB Global Tables

The `DynamoDB` tables are global tables, which provides 99.999% availability. Data is automatically replicated between regions, so there is no need to write scripts or code to set up the replication. This also plays a key role in disaster recovery between regions.
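For illustration, adding a replica region to an existing table (global tables version 2019.11.21) is a single API call; the table and region names below are placeholders, not the real configuration:

```python
# Hypothetical sketch: add a second-region replica to an existing table.
# Assumes the table already has streams enabled, as global tables require.
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# DynamoDB handles cross-region replication once the replica exists;
# no custom replication scripts are required.
dynamodb.update_table(
    TableName="events",
    ReplicaUpdates=[{"Create": {"RegionName": "us-west-2"}}],
)
```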
7 changes: 7 additions & 0 deletions mkdocs.yml
@@ -0,0 +1,7 @@
site_name: Jetty System Design
site_url: https://mclaireh.github.io/event-drive-arch/
theme:
  name: readthedocs
nav:
  - Home: index.md
  - Pub-Sub Eventing System: pub-sub.md
