Add new challenge

abregman
2019-12-21 13:26:41 +02:00
parent 0f871e9b9e
commit 979d11d13f
17 changed files with 188 additions and 13 deletions

View File

@@ -0,0 +1,6 @@
## Ansible, Minikube and Docker
* Write a simple program in any language you want that outputs "I'm on %HOSTNAME%", where HOSTNAME is the actual host name of the machine the app is running on (a sketch of one possible implementation follows this list)
* Write a Dockerfile which will run your app
* Create the YAML files required for deploying the pods
* Write and run an Ansible playbook which will install Docker, Minikube and kubectl and then create a deployment in minikube with your app running.
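
As a starting point for the first bullet, here is a minimal sketch in Python (the language choice is yours; the snippet only shows that the host name must be resolved at runtime rather than hard-coded):

```
#!/usr/bin/env python
# Minimal sketch: print "I'm on <hostname>" using the actual host name
# of the machine (or container) the program runs on.
import socket


def main():
    print("I'm on {}".format(socket.gethostname()))


if __name__ == "__main__":
    main()
```

Inside a container, `socket.gethostname()` returns the container's host name (by default the container ID, or the pod name in Kubernetes), which is what the Dockerfile and deployment steps above will exercise.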

View File

@@ -0,0 +1,12 @@
## CI for Open Source Project
1. Choose an open source project from GitHub and fork it
2. Create a CI pipeline/workflow for the project you forked
3. The CI pipeline/workflow will include anything that is relevant to the project you forked. For example:
   * If it's a Python project, you will run a PEP 8 style check (e.g. with `pycodestyle`; a small sketch follows this list)
   * If the project has a unit tests directory, you will run these unit tests as part of the CI
4. In a separate file, describe what is running as part of the CI and why you chose to include it. You can also describe any thoughts, dilemmas or challenges you had
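
For the PEP 8 item, the CI step can be as small as the sketch below (assuming the `pycodestyle` package is available in the CI environment; most setups would simply invoke the `pycodestyle` command instead of a script):

```
#!/usr/bin/env python
# Sketch: exit non-zero when PEP 8 violations are found, so the CI step fails.
import sys

import pycodestyle


def main():
    style = pycodestyle.StyleGuide()
    report = style.check_files(["."])  # check the whole repository
    sys.exit(1 if report.total_errors else 0)


if __name__ == "__main__":
    main()
```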
### Bonus
Containerize the app of the project you forked using any containerization technology you want.

View File

@@ -0,0 +1,19 @@
## Cloud Slack Bot
Create a Slack bot to manage cloud instances. You can choose whatever cloud provider you want (e.g. OpenStack, AWS, GCP, Azure).
You should provide:
* Instructions on how to use it
* Source code of the slack bot
* A running Slack bot account or a deployment script so we can test it
The bot should be able to support:
* Creating new instances
* Removing existing instances
* Starting an instance
* Stopping an instance
* Displaying the status of an instance
* Listing all available instances
The bot should also be able to show a help message. A minimal sketch of one possible implementation follows.
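
For illustration only, the sketch below shows how the instance-management commands might map to cloud API calls, assuming AWS and the `boto3` library; the Slack wiring and command parsing are omitted and all function names are illustrative:

```
#!/usr/bin/env python
# Hypothetical sketch: map bot commands to EC2 operations with boto3.
# Assumes AWS credentials are already configured in the environment.
import boto3

ec2 = boto3.client("ec2")


def list_instances():
    """Return (id, state) pairs for every instance visible to the account."""
    reservations = ec2.describe_instances()["Reservations"]
    return [(i["InstanceId"], i["State"]["Name"])
            for r in reservations for i in r["Instances"]]


def start_instance(instance_id):
    ec2.start_instances(InstanceIds=[instance_id])


def stop_instance(instance_id):
    ec2.stop_instances(InstanceIds=[instance_id])


def remove_instance(instance_id):
    ec2.terminate_instances(InstanceIds=[instance_id])


def create_instance(image_id, instance_type="t2.micro"):
    # image_id is a placeholder; a real bot would validate the AMI first.
    resp = ec2.run_instances(ImageId=image_id, InstanceType=instance_type,
                             MinCount=1, MaxCount=1)
    return resp["Instances"][0]["InstanceId"]
```

A Slack handler (for example a slash command) then only needs to parse the user's text, dispatch to one of these functions and post the result back as the bot's reply, including a help message listing the supported commands.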

View File

@@ -0,0 +1,21 @@
# Elasticsearch, Kibana and AWS
Your task is to build an Elasticsearch cluster along with a Kibana dashboard on one of the following clouds:
* AWS
* OpenStack
* Azure
* GCP
You have to describe in detail (preferably with some diagrams) how you are going to set it up.
Please describe in detail:
- How you scale it up or down
- How you quickly (in less than 20 minutes) provision the cluster
- How you apply a security policy for access control
- How you transfer the logs from the app to ELK
- How you deal with multiple apps running in different regions
## Solution
One possible solution can be found [here](solutions/elk_kibana_aws.md)

View File

@@ -0,0 +1,34 @@
Your mission, should you choose to accept it, involves fixing the app in this directory, containerizing it and setting up CI for it.
Please read all the instructions carefully.
## Installation
1. Create a virtual environment with `python3 -m venv challenge_venv`
2. Activate it with `source challenge_venv/bin/activate`
3. Install the requirements with `pip install -r requirements.txt`
## Run the app
1. Run `export FLASK_APP=app/app.py`
2. To run the app, execute `flask run`. If it doesn't work, fix it
## Containers
Using Docker or Podman, containerize the Flask app so users can run the following two commands:
```
docker build -t app:latest /path/to/dockerfile/directory
docker run -d -p 5000:5000 app
```
1. You can use any base image you would like
2. Containerize only what you need for running the application, nothing else.
## CI
Great, now that we have a working app and can run it in a container, let's set up CI for it so it won't break again in the future.
1. The CI should run the tests in the app directory
2. There should be some kind of test for the Dockerfile you wrote
3. Add an additional unit test (or another level of tests); a sketch of one such test follows
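
For the last item, one possible additional unit test is sketched below. It assumes the application object is importable as `from app import app`, as the existing tests do, and uses Flask's built-in test client:

```
#!/usr/bin/env python
# Sketch of an extra unit test: call the '/' route through Flask's test client
# and verify both the status code and the response body.
import unittest

from app import app


class RouteTestCase(unittest.TestCase):
    def setUp(self):
        app.config['TESTING'] = True
        self.client = app.test_client()

    def test_index_returns_hello_world(self):
        response = self.client.get('/')
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.data, b'Hello, World!')


if __name__ == '__main__':
    unittest.main()
```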

View File

@@ -0,0 +1,9 @@
#!/usr/bin/env python
# coding=utf-8
from flask import Flask
app = Flask(__name__)
@app.route('/')
def hello_world():
    return 'Hello, World!'

View File

@@ -0,0 +1,12 @@
#!/usr/bin/env python
# coding=utf-8
import os
basedir = os.path.abspath(os.path.dirname(__file__))
SECRET_KEY = 'shhh'
CSRF_ENABLED = True
SQLALCHEMY_DATABASE_URI = 'sqlite:///' + os.path.join(basedir, 'app.db')

View File

@@ -0,0 +1,25 @@
#!/usr/bin/env python
# coding=utf-8
import os
import unittest
from config import basedir
from app import app
from app import db
class TestCase(unittest.TestCase):
    def setUp(self):
        app.config['TESTING'] = True
        app.config['WTF_CSRF_ENABLED'] = False
        app.config['SQLALCHEMY_DATABASE_URI'] = 'sqlite:///' + os.path.join(basedir, 'test.db')
        self.app = app.test_client()
        db.create_all()

    def tearDown(self):
        db.session.remove()
        db.drop_all()


if __name__ == '__main__':
    unittest.main()

View File

@@ -0,0 +1,6 @@
// Intended to run from the Jenkins script console.
// Find all jobs whose name contains the string REMOVE_ME.
def jobs = Jenkins.instance.items.findAll { job -> job.name =~ /REMOVE_ME/ }

jobs.each { job ->
    println job.name
    // Dry run by default; uncomment the next line to actually delete the matching jobs.
    //job.delete()
}

View File

@@ -0,0 +1,16 @@
// Remove build directories under buildDirectory that were last modified
// more than `days` days ago.
def removeOldBuilds(buildDirectory, days = 14) {
    def buildDir = new File(buildDirectory)
    def currentTime = new Date()
    def backTime = currentTime - days

    buildDir.list().each { fileName ->
        def folder = new File("${buildDirectory}/${fileName}")
        if (folder.isDirectory()) {
            def timeStamp = new Date(folder.lastModified())
            if (timeStamp.before(backTime)) {
                // deleteDir() removes the directory even when it is not empty
                folder.deleteDir()
            }
        }
    }
}

View File

@@ -0,0 +1,10 @@
## Jenkins Pipelines
Write/Create the following Jenkins pipelines:
* A pipeline which will run unit tests upon git push to a certain repository
* A pipeline which will do the following:
  * Provision an instance (can also be a container)
  * Configure the instance as an Apache web server
  * Deploy a web application on the provisioned instance

View File

@@ -0,0 +1,11 @@
## Jenkins Scripts
Write the following scripts:
* Remove all the jobs which include the string "REMOVE_ME" in their name
* Remove builds older than 14 days
### Answer
* [Remove jobs which include specific string](jenkins/scripts/jobs_with_string.groovy)
* [Remove builds older than 14 days](jenkins/scripts/old_builds.groovy)

View File

@@ -0,0 +1,22 @@
# Elasticsearch, Kibana and AWS - Solution
This is one of many possible solutions; it relies heavily on AWS.
* Create a VPC with a subnet so we can place the Elasticsearch node(s) in an internal environment only.
If required, we will also set up a NAT gateway for public access.
* Create an IAM role for access to the cluster, and a separate role for admin access.
* To provision the solution quickly, we will use the managed Elasticsearch service from AWS for the production deployment.
This way we also cover multiple AZs. For authentication, we either use Amazon Cognito or the organization's LDAP server.
* To transfer data, we will have to install a Logstash agent on the instances. The agent will be responsible
for pushing the data to Elasticsearch (a small sketch of this log-shipping step follows below).
* For monitoring we will use:
  * CloudWatch to monitor cluster resource utilization
  * CloudWatch dashboards for the cluster metrics
* If access is required from multiple regions, we will transfer all the data to S3, which will allow us to view the data
from different regions and consolidate it in one dashboard
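
As an illustration of the log-shipping step only, the sketch below sends one structured event from an application host, assuming Logstash is configured with a `tcp` input and a `json_lines` codec on port 5000; the host name and fields are hypothetical:

```
#!/usr/bin/env python
# Hypothetical sketch: ship one structured log event to a Logstash TCP input
# that uses the json_lines codec (one JSON document per line).
import json
import socket
from datetime import datetime, timezone

LOGSTASH_HOST = "logstash.internal.example.com"  # placeholder
LOGSTASH_PORT = 5000                             # matches the assumed tcp input


def ship_event(message, level="INFO"):
    event = {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "host": socket.gethostname(),
        "level": level,
        "message": message,
    }
    with socket.create_connection((LOGSTASH_HOST, LOGSTASH_PORT)) as conn:
        conn.sendall((json.dumps(event) + "\n").encode("utf-8"))


if __name__ == "__main__":
    ship_event("application started")
```

In practice the Logstash agent installed on the instance would usually tail the application's log files rather than the application pushing events itself; the snippet only shows the shape of the data that ends up in Elasticsearch.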

View File

@@ -0,0 +1,11 @@
# Write a Dockerfile and run a container
Your task is as follows:
1. Create a Docker image:
   * Use CentOS or Ubuntu as the base image
   * Install the Apache web server
   * Deploy any web application you want
   * Add HTTPS support (using HAProxy as a reverse proxy)
2. Once you have written the Dockerfile and created an image, run the container and test the application. Describe how you tested it and provide the output
3. Describe one or more weaknesses of your Dockerfile. Is it ready to be used in production?