Project author: gmasse

Project description:
Universal Time-Series Database Python Client (InfluxDB, Warp10, ...)
Language: Python
Project address: git://github.com/gmasse/universal-tsdb.git
Created: 2020-04-04T16:52:52Z
Project community: https://github.com/gmasse/universal-tsdb

License: GNU General Public License v3.0


universal-tsdb

A Universal Time-Series Database Python Client (InfluxDB, Warp10, …)

Introduction

This project aims to abstract your Time-Series backend, keeping your code as agnostic as possible.

Some examples:

  • proof of concept
  • early stages of development (when you are not yet sure which platform to use)
  • ETL (Extract-Transform-Load), for the load step

:warning: The current code only offers INGESTING functions (writing points to a backend).
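
For example, the same ingestion code can target either backend by changing only the Client
construction. Below is a minimal sketch, assuming local InfluxDB and Warp10 endpoints; the
URLs, database name, and token are placeholders:

  from universal_tsdb import Client, Ingester

  # Pick a backend; the ingestion code below stays identical.
  use_influx = True
  if use_influx:
      backend = Client('influx', 'http://localhost:8086', database='metrics')
  else:
      backend = Client('warp10', 'http://localhost/api/v0', token='WRITING_TOKEN_ABCDEF0123456789')

  series = Ingester(backend)
  series.append(1585934895000, measurement='mes', tags={'tag1': 'value1'}, field1=42.0)
  series.commit()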

Quickstart

Installation

  $ pip install universal-tsdb

  >>> from universal_tsdb import Client, Ingester
  >>> backend = Client('influx', 'http://localhost:8086', database='test')
  >>> series = Ingester(backend)
  >>> series.append(1585934895000, measurement='data', field1=42.0)
  >>> series.payload()
  'data field1=42.0 1585934895000000000\n'
  >>> series.commit()

InfluxDB

  from universal_tsdb import Client, Ingester

  backend = Client('influx', 'http://localhost:8086', database='metrics',
                   backend_username='user', backend_password='passwd')
  series = Ingester(backend)
  series.append(1585934895000, measurement='mes', field1=42.0)
  series.append(1585934896000, measurement='mes', tags={'tag1':'value1'}, field1=43.4, field2='value')
  series.commit()

The code above will generate a data payload based on the InfluxDB line protocol
and send it via an HTTP(S) request.

  POST /write?db=metrics&u=user&p=passwd HTTP/1.1
  Host: localhost:8086

  mes field1=42.0 1585934895000000000
  mes,tag1=value1 field1=43.4,field2="value" 1585934896000000000
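
If you want to inspect the generated line protocol before sending it, the payload() method
shown in the Quickstart returns the pending data as a string. A minimal sketch, reusing the
backend defined above:

  series = Ingester(backend)
  series.append(1585934895000, measurement='mes', field1=42.0)
  print(series.payload())  # expected: 'mes field1=42.0 1585934895000000000\n' (see Quickstart)
  series.commit()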

Warp10

  from universal_tsdb import Client, Ingester

  backend = Client('warp10', 'http://localhost/api/v0', token='WRITING_TOKEN_ABCDEF0123456789')
  series = Ingester(backend)
  series.append(1585934895000, field1=42.0)
  series.append(1585934896000, tags={'tag1':'value1'}, field1=43.4, field2='value')
  series.commit()

The code above will generate a data payload based on the Warp10 GTS format
and send it via an HTTP(S) request.

  POST /api/v0/update HTTP/1.1
  Host: localhost
  X-Warp10-Token: WRITING_TOKEN_ABCDEF0123456789

  1585934895000000// field1{} 42.0
  1585934896000000// field1{tag1=value1} 43.4
  1585934896000000// field2{tag1=value1} 'value'

Advanced Usage

Batch processing

When you have a large volume of data to send, you may want to split it into several HTTP requests.
In batch mode, the library commits (sends) the data automatically:

  backend = Client('influx', 'http://localhost:8086', database='metrics')
  series = Ingester(backend, batch=10)
  for i in range(26):
      series.append(field=i)
  series.commit()  # final commit to save the last 6 values

  Commit#1 Sent 10 new series (total: 10) in 0.02 s @ 2000.0 series/s (total execution: 0.13 s)
  Commit#2 Sent 10 new series (total: 20) in 0.02 s @ 2000.0 series/s (total execution: 0.15 s)
  Commit#3 Sent 6 new series (total: 26) in 0.01 s @ 2000.0 series/s (total execution: 0.17 s)
  REPORT: 3 commits (3 successes), 26 series, 26 values in 0.17 s @ 2000.0 values/s

Omitting Timestamp

If you omit the timestamp, the library uses the time.time() function
to generate a UTC epoch time. Precision is system-dependent.
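
A minimal sketch, reusing a backend defined as in the examples above and assuming append()
accepts the same keyword arguments when the timestamp is left out:

  series = Ingester(backend)
  series.append(measurement='data', field1=42.0)  # no timestamp given: time.time() is used
  series.commit()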

Measurement in Warp10

The InfluxDB notion of a measurement does not exist in Warp10.
The library emulates it by prefixing the Warp10 classname with the measurement name:

  backend = Client('warp10', 'http://localhost/api/v0', token='WRITING_TOKEN_ABCDEF0123456789')
  series = Ingester(backend)
  series.append(1585934895000, measurement='mes', field1=42.0)
  series.commit()

  1585934895000000// mes.field1{} 42.0

Todo

  • API documentation
  • Examples
  • Data query/fetch functions
  • Refactoring of backend specific code (inherited classes?)
  • Time-Series Line protocol optimization
  • Gzip/deflate HTTP compression
  • Code coverage / additional tests