Add more questions

This commit is contained in:
abregman
2019-10-05 16:43:54 +03:00
parent c5c26be24c
commit 7d89739a14
4 changed files with 70 additions and 16 deletions


@@ -0,0 +1,6 @@
## Ansible, Minikube and Docker
* Write a simple program in any language you want that outputs "I'm on <HOSTNAME>"
* Write a Dockerfile which will run your app
* Create the YAML files required for deploying the pods
* Write and run an Ansible playbook which will install Docker, Minikube and kubectl and then create a deployment in minikube with your app running.
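As a sketch of the first bullet, a minimal Python program that prints the required message might look like this (any language works; Python is just one choice):

```python
import socket

# Print the machine's hostname in the required "I'm on <HOSTNAME>" format
print(f"I'm on {socket.gethostname()}")
```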


@@ -15,3 +15,7 @@ Please describe in detail:
- How you apply security policy for access control
- How you transfer the logs from the app to ELK
- How you deal with multi apps running in different regions
## Solution
One possible solution can be found [here](solutions/elk_kibana_aws.md)


@@ -0,0 +1,22 @@
# Elasticsearch, Kibana and AWS - Solution
This is one out of many possible solutions. It relies heavily on AWS.
* Create a VPC with a subnet so we can place the Elasticsearch node(s) in an internal environment only.
If required, we will also set up a NAT gateway for public access.
* Create an IAM role for access to the cluster. Also, create a separate role for admin access.
* To provision the solution quickly, we will use the managed Elasticsearch service from AWS for the production deployment.
This way we also cover multiple AZs. For authentication, we either use Amazon Cognito or the organization's LDAP server.
* To transfer data, we will have to install a Logstash agent on the instances. The agent will be responsible
for pushing the data to Elasticsearch.
* For monitoring we will use:
  * CloudWatch to monitor cluster resource utilization
  * A CloudWatch metrics dashboard
* If access is required from multiple regions, we will transfer all the data to S3, which will allow us to view the data
from different regions and consolidate it in one dashboard.
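
The Logstash step above can be sketched as a minimal pipeline configuration. The log path and the Elasticsearch endpoint below are placeholder assumptions, not values from the original solution:

```
input {
  file {
    # Hypothetical application log path
    path => "/var/log/myapp/*.log"
    start_position => "beginning"
  }
}

output {
  elasticsearch {
    # Hypothetical Amazon Elasticsearch Service endpoint
    hosts => ["https://my-es-domain.us-east-1.es.amazonaws.com:443"]
  }
}
```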