Weightlifting
@bogdan
@SIMPLYSOCIAL
ARCHITECTURE

api.gosimplysocial.com        app.gosimplysocial.com
API 01 | API 02 | API N...    APP 01 | APP 02 | APP N...

BACKEND01 | BACKEND02 | BACKENDN...

DBs
LIFE OF A REQUEST

api.gosimplysocial.com        app.gosimplysocial.com
API 01 | API 02 | API N...    APP 01 | APP 02 | APP N...

BACKEND01 | BACKEND02 | BACKENDN...

DBs
2M x
USE OF PYTHON
CAN I HEAR A BOOO FOR ?
PYTHON ON APPSERVERS

AWS EC2
Ubuntu 12.04
Chef
NGINX + PHP-FPM
Fabric (deployment)
Metrics
Logging
METRICS
INTRODUCING OSSIE

METRICS GATHERING
REPORTING STATS THROUGH A SIMPLE WEB INTERFACE
INTERACTION WITH RUNNING PROCESSES
BACKED BY STATSD AND GRAPHITE
INTRODUCING OSSIE

import ossie

stat = ossie.Client("arcade.random_worker")

stat.gauge("a_metric", 256)
stat.timer("random.metric", 1.28)

with stat.timer("random.metric.with"):
    # do something that requires an amount of time;
    # the ossie client will report the right time
    pass
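Ossie's internals aren't shown in the talk; as a hedged sketch, this is roughly how a client like it could emit gauges and timers over statsd's plain-text UDP protocol. The `MiniStatsClient` class and its method names are illustrative, not ossie's real API:

```python
import socket
import time
from contextlib import contextmanager


class MiniStatsClient:
    """Toy statsd client illustrating the wire format a client like ossie relies on."""

    def __init__(self, prefix, host="127.0.0.1", port=8125):
        self.prefix = prefix
        self.addr = (host, port)
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

    def _send(self, payload):
        # statsd is fire-and-forget UDP: "<bucket>:<value>|<type>"
        self.sock.sendto(payload.encode("ascii"), self.addr)
        return payload  # returned only so the format is easy to inspect

    def gauge(self, name, value):
        return self._send(f"{self.prefix}.{name}:{value}|g")

    def timer(self, name, seconds):
        # statsd timers are reported in milliseconds
        return self._send(f"{self.prefix}.{name}:{seconds * 1000:.0f}|ms")

    @contextmanager
    def timed(self, name):
        # context-manager form, mirroring the `with stat.timer(...)` usage above
        start = time.monotonic()
        try:
            yield
        finally:
            self.timer(name, time.monotonic() - start)


stat = MiniStatsClient("arcade.random_worker")
stat.gauge("a_metric", 256)        # sends "arcade.random_worker.a_metric:256|g"
stat.timer("random.metric", 1.28)  # sends "arcade.random_worker.random.metric:1280|ms"
```

Because the transport is UDP, a dead statsd daemon never slows the worker down, which is why this style of instrumentation is safe on a hot request path.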
LOGGING

FULL JSON LOGGING
EVEN NGINX LOGS JSON
POWERED BY FLUENTD + S3
FUTURE: LOGS CAN BE SEARCHED BY ELASTICSEARCH + KIBANA
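A minimal sketch of what full-JSON application logging can look like with Python's stdlib `logging`; the field names here are assumptions, not SimplySocial's actual schema. Fluentd can tail these one-object-per-line records and ship them to S3:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Emit each record as a single JSON object, one per line."""

    def format(self, record):
        return json.dumps({
            "ts": self.formatTime(record),       # human-readable timestamp
            "level": record.levelname,           # INFO, WARNING, ...
            "logger": record.name,
            "message": record.getMessage(),      # msg with args interpolated
        })


handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("app")
log.addHandler(handler)
log.setLevel(logging.INFO)

log.info("user %s logged in", "bogdan")
# emits e.g. {"ts": "...", "level": "INFO", "logger": "app", "message": "user bogdan logged in"}
```

Structured records like these are what make the "FUTURE" bullet cheap: Elasticsearch can index the JSON fields directly, with no log-parsing regexes.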
SUPERVISORD

WORKERS
INTRODUCING ARCADE

QUEUE-BASED WORKERS FRAMEWORK
CONVENTION OVER CONFIGURATION
SIMPLE DEPLOYMENT
DETAILED LOGGING AND MONITORING
INTRODUCING ARCADE

job types: CRAWL INSIGHTS, CRAWL PAGES, CRAWL EVENTS
    -> queue: ARCADEFACEBOOKCRAWL
    -> python workers
INTRODUCING ARCADE

facebook/
├── __init__.py
├── common
│   ├── __init__.py
│   ├── crawler/
│   ├── exception.py
│   ├── facebook_model.py
│   ├── helper.py
│   ├── indexer/
│   └── ...
├── config
│   ├── __init__.py
│   ├── config.yml
│   └── env
│       ├── __init__.py
│       ├── crawler.py
│       └── ...
├── tests
│   ├── __init__.py
│   ├── fixtures
│   │   ├── crawler_feed.yml
│   │   ├── ...
│   │   └── stream_routing.yml
│   ├── test_crawler_feed.py
│   ├── ...
│   └── test_stream_worker.py
└── workers
    ├── __init__.py
    ├── crawler.py
    └── ...
from arcade.worker import TaskWorker
from arcade.helper import retry, task

# [...]

class FacebookCrawler(TaskWorker):

    def __init__(self, dependencies):
        super(FacebookCrawler, self).__init__(dependencies)

    @task("default", "crawl")
    def crawl_feed(self, data):
        """
        Job type:
          * uid
          * last_crawl
        """
        feed = CrawlerFeedModel(self)
        feed.crawl(data)

    def before_each(self, data):
        # [...]
        pass


def run(dependencies):
    try:
        FacebookCrawler(dependencies).run()
    except Exception:
        dependencies.get("exception").capture_exception()
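The `@task("default", "crawl")` decorator suggests arcade routes queue messages to methods by (queue, job type). Arcade's internals aren't public in the talk, so this is a hypothetical sketch of how such a dispatch table could be built; `dispatch` and `_task_key` are invented names:

```python
def task(queue, job_type):
    """Tag a worker method with the (queue, job_type) it handles."""
    def decorate(fn):
        fn._task_key = (queue, job_type)
        return fn
    return decorate


class TaskWorker:
    def __init__(self):
        # Collect every method tagged with @task into a dispatch table.
        self._handlers = {
            method._task_key: method
            for method in (getattr(self, name) for name in dir(self))
            if callable(method) and hasattr(method, "_task_key")
        }

    def dispatch(self, queue, job_type, data):
        # Route an incoming queue message to the matching handler.
        return self._handlers[(queue, job_type)](data)


class EchoWorker(TaskWorker):
    @task("default", "crawl")
    def crawl_feed(self, data):
        return f"crawling uid={data['uid']}"


worker = EchoWorker()
print(worker.dispatch("default", "crawl", {"uid": 42}))  # prints "crawling uid=42"
```

This is the "convention over configuration" point in miniature: adding a new job type is just adding one decorated method, with no routing config to edit.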
AND MANY MORE:

CONTINUOUS INTEGRATION
RELIABLE ALERTING
CLOUDWATCH TO STATSD
REAL USER MONITORING
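"CLOUDWATCH TO STATSD" presumably means bridging AWS metrics into the same statsd/graphite pipeline as everything else. A hedged sketch of the reformatting step: the datapoint shape mirrors what a CloudWatch query returns, but `to_statsd_gauges` and the `aws` prefix are illustrative, not the talk's actual bridge:

```python
def to_statsd_gauges(metric_name, datapoints, prefix="aws"):
    """Turn CloudWatch-style datapoints into statsd gauge lines, oldest first."""
    lines = []
    for point in sorted(datapoints, key=lambda p: p["Timestamp"]):
        lines.append(f"{prefix}.{metric_name}:{point['Average']}|g")
    return lines


# Datapoints as a CloudWatch query might return them (out of order).
points = [
    {"Timestamp": "2013-10-01T00:05:00Z", "Average": 41.5},
    {"Timestamp": "2013-10-01T00:00:00Z", "Average": 40.0},
]
print(to_statsd_gauges("cpu_utilization", points))
# prints ['aws.cpu_utilization:40.0|g', 'aws.cpu_utilization:41.5|g']
```

Once AWS metrics land in statsd, graphite dashboards and alerting see one unified namespace instead of two monitoring systems.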
Q/A
Thanks!