
Project requirements

The scope of this project is to create a Python microservice using the eventify framework to
collect the following data points from the Google Compute Engine API, documented here:

https://cloud.google.com/compute/docs/api/how-tos/authorization

https://cloud.google.com/compute/docs/reference/latest/regions/list

https://cloud.google.com/compute/docs/reference/latest/machineTypes/list
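
As an illustration only, the snippet below shows one way these endpoints might be queried. It assumes the google-api-python-client library and Application Default Credentials; neither the library choice nor the placeholder project ID is mandated by these requirements.

# Sketch: listing regions and machine types via the GCE API (assumed client library).
# Authentication uses Application Default Credentials, e.g. GOOGLE_APPLICATION_CREDENTIALS
# pointing at a service-account key; 'my-project' is a placeholder project ID.
from googleapiclient import discovery

compute = discovery.build('compute', 'v1')

# regions/list: enumerate every region available to the project.
regions = compute.regions().list(project='my-project').execute()
for region in regions.get('items', []):
    print(region['name'])

# machineTypes/list is zonal, so a zone must be supplied;
# machineTypes().aggregatedList() covers all zones in one call.
machine_types = compute.machineTypes().list(
    project='my-project', zone='us-central1-a').execute()
for machine_type in machine_types.get('items', []):
    print(machine_type['name'], machine_type['guestCpus'], machine_type['memoryMb'])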

The data points to collect are described in the following format:

{
    name: the name of the data point to collect for a given machine type
    description: a description of the data point
    type_title: the underlying data type the point should be collected as
}
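
For illustration only, a single entry in this format might look like the example below; the name used here is a hypothetical placeholder, and the authoritative list is linked underneath.

{
    name: guest_cpus
    description: the number of virtual CPUs available to the machine type
    type_title: integer
}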

All of the data points that should be collected are here:

https://bpaste.net/show/044ae52dc5bb

It should be noted that not all data points may be available for each Google Compute
Engine machine type.

To determine which data is available, it is recommended to review the following
documentation:

https://cloud.google.com/compute/docs/machine-types

https://cloud.google.com/compute/docs/reference/latest/machineTypes#resource

https://cloud.google.com/compute/pricing#machinetype

https://cloud.google.com/compute/docs/cpu-platforms

https://cloud.google.com/compute/docs/disks/

https://cloud.google.com/compute/docs/regions-zones/regions-zones

There may be other relevant documentation available via the Google Compute Engine
website.

The goal of this project is to collect all of the available data points from each region of Google
Compute Engine via the API and normalize the data into the data point names given in the
JSON document above. Each normalized machine type should then be published by
creating a new event using the eventify framework, which is detailed below.
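
As a rough sketch only, the snippet below shows how a raw machineTypes resource might be normalized before publishing. The field mapping and the normalized names are assumptions for illustration; the authoritative names are those in the bpaste document linked above.

# Sketch: map raw machineTypes resource fields onto normalized data point names.
# NORMALIZED_FIELDS is a hypothetical mapping; replace it with the real list above.
NORMALIZED_FIELDS = {
    'guestCpus': 'guest_cpus',
    'memoryMb': 'memory_mb',
    'maximumPersistentDisks': 'maximum_persistent_disks',
}


def normalize_machine_type(raw, region):
    """Return a dict keyed by the normalized data point names."""
    normalized = {'region': region, 'machine_type': raw.get('name')}
    for api_field, point_name in NORMALIZED_FIELDS.items():
        if api_field in raw:  # not every data point exists for every machine type
            normalized[point_name] = raw[api_field]
    return normalized
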
General software requirements
Use the eventify module (https://github.com/eventifyio/eventify)
Use Python 3.6 (https://www.python.org/downloads/release/python-362/)

Middleware availability
All required middleware has been set up and is running and available via socket.opless.io.
This middleware is ephemeral and will not store any data to disk; it is provided to support
your development efforts. Its configuration is given in the boilerplate section below.

Boilerplate code
The following code examples are provided to assist you in developing the project
described above.

1. Set up environment variables


export EVENT_DB_USER=postgres
export EVENT_DB_PASS=testpassword
export EVENT_DB_HOST=socket.opless.io
2. Files provided to help you start development

config.json: the basic configuration used by the eventify module;
this should require no modifications.
{
    "_comment": "service configuration",
    "name": "gce-vm-collector",
    "image": "gce/vm",
    "driver": "crossbar",
    "transport_host": "ws://socket.opless.io:8080/ws",
    "pub_options": {
        "acknowledge": true,
        "retain": false,
        "exclude_me": true
    },
    "publish_topic": {
        "topic": "gce-vm",
        "timeout": 20,
        "reply_in": 0
    },
    "subscribed_topics": [
        "gce-vm"
    ],
    "sub_options": {
        "get_retained": false
    },
    "replay_events": false,
    "replay_type": "event_store"
}
service.py: the entrypoint of the application;
this should require no modifications.
import logging

from eventify.service import Service

from service.handler import Collector

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger('gce.vm.collector')


def run():
    """
    Run an eventify service
    """
    Service(
        config_file='config.json',
        handlers=[Collector]
    ).start()


if __name__ == '__main__':
    run()

service/handler.py: a registered callback via eventify;
you should implement the business logic for this service within the GoogleCollector class.
import asyncio

from eventify.base_handler import BaseHandler
from eventify.event import Event


class GoogleCollector:
    """
    Google Specific Collector
    """

    async def collect_vm_data(self):
        print('...collecting data from gce api...')
        await asyncio.sleep(1)


class Collector(BaseHandler, GoogleCollector):
    """
    Generic collector
    """

    def __init__(self):
        """
        Service initialization
        """
        print('...service initialized...')

    async def worker(self):
        """
        Collect data for required data points
        https://bpaste.net/show/044ae52dc5bb
        """
        print('...working...')
        await self.collect_vm_data()
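
As a hedged illustration only, the snippet below shows one possible shape of the business logic inside GoogleCollector, combining the GCE API calls and normalization sketched earlier with eventify's Event and emit_event from the tutorial below. The client library choice, the placeholder project ID, the event name, and the normalize_machine_type helper are assumptions, not requirements.

from googleapiclient import discovery

from eventify.event import Event


class GoogleCollector:
    """
    Google Specific Collector (sketch of possible business logic)
    """

    async def collect_vm_data(self):
        # The googleapiclient calls are synchronous; in production they could be
        # offloaded with loop.run_in_executor, which is omitted here for brevity.
        compute = discovery.build('compute', 'v1')
        project = 'my-project'  # placeholder project ID

        regions = compute.regions().list(project=project).execute()
        for region in regions.get('items', []):
            # Zone URLs look like .../zones/us-central1-a; keep the final segment.
            for zone_url in region.get('zones', []):
                zone = zone_url.rsplit('/', 1)[-1]
                machine_types = compute.machineTypes().list(
                    project=project, zone=zone).execute()
                for raw in machine_types.get('items', []):
                    # normalize_machine_type is the hypothetical helper sketched earlier.
                    normalized = normalize_machine_type(raw, region['name'])
                    await self.emit_event(Event({
                        'name': 'GceMachineDiscovered',
                        'message': normalized
                    }))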
Relevant tutorials

1. How to publish an event using eventify

from eventify.event import Event


# found_new_machine_type is written as a method on your handler class (hence self).
async def found_new_machine_type(self, normalized_data):
    new_event = Event({
        'name': 'GceMachineDiscovered',
        'message': normalized_data
    })
    await self.emit_event(new_event)

Acceptance criteria

Upon review of the provided source code and verification that it properly collects the data points
available from the Google Compute Engine API, the following will be checked:
1. Pylint scoring
2. Provided code uses eventify
3. Provided code publishes events to the middleware provided
4. Manual check that the data received is correct based on what is documented in the UI for the
given machine types.
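
As a usage note, Pylint can be run locally before submission; the file names below follow the boilerplate layout, and these requirements do not state a specific score threshold.

pip install pylint
pylint service.py service/handler.py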
