[Plugin] [HTTP] Support CeresDB

qianmoQ 2023-04-12 17:31:42 +08:00
parent bc3eb49e97
commit 3c737b1812
22 changed files with 631 additions and 3 deletions

View File

@@ -151,6 +151,9 @@ Here are some of the major database solutions that are supported:
</a>&nbsp;
<a href="https://kafka.apache.org" target="_blank">
<img src="docs/docs/assets/plugin/kafka.png" alt="Apache Kafka" height="50" />
</a>&nbsp;
<a href="https://docs.ceresdb.io/" target="_blank">
<img src="docs/docs/assets/plugin/ceresdb.png" alt="CeresDB" height="50" />
</a>
</p>

View File

@@ -315,6 +315,11 @@
<artifactId>datacap-native-h2</artifactId>
<version>${project.version}</version>
</dependency>
<dependency>
<groupId>io.edurt.datacap</groupId>
<artifactId>datacap-http-ceresdb</artifactId>
<version>${project.version}</version>
</dependency>
<!-- Executor -->
<dependency>
<groupId>io.edurt.datacap</groupId>

View File

@@ -0,0 +1,58 @@
name: CeresDB
supportTime: '2023-04-12'
configures:
- field: name
type: String
required: true
message: name is a required field, please be sure to enter
- field: host
type: String
required: true
value: 127.0.0.1
message: host is a required field, please be sure to enter
- field: port
type: Number
required: true
min: 1
max: 65535
value: 5440
message: port is a required field, please be sure to enter
pipelines:
- executor: Seatunnel
type: SOURCE
fields:
- field: host
origin: host|port
required: true
- field: database
origin: database
required: true
- field: sql
origin: context
required: true
- field: username
origin: username
required: true
- field: password
origin: password
required: true
- executor: Seatunnel
type: SINK
fields:
- field: host
origin: host|port
required: true
- field: database
origin: database
required: true
- field: sql
origin: context
required: true
- field: username
origin: username
required: true
- field: password
origin: password
required: true

Binary file not shown (new image, 128 KiB).

Binary file not shown (new image, 128 KiB).

View File

@@ -181,6 +181,9 @@ Datacap is a fast, lightweight, intuitive system.
</a>&nbsp;
<a href="https://kafka.apache.org" target="_blank" class="connector-logo-index">
<img src="/assets/plugin/kafka.png" alt="Apache Kafka" height="50" />
</a>&nbsp;
<a href="https://docs.ceresdb.io/" target="_blank" class="connector-logo-index">
<img src="/assets/plugin/ceresdb.png" alt="CeresDB" height="50" />
</a>
</p>

View File

@@ -181,6 +181,9 @@ Datacap is a fast, lightweight, intuitive system.
</a>&nbsp;
<a href="https://kafka.apache.org" target="_blank" class="connector-logo-index">
<img src="/assets/plugin/kafka.png" alt="Apache Kafka" height="50" />
</a>&nbsp;
<a href="https://docs.ceresdb.io/" target="_blank" class="connector-logo-index">
<img src="/assets/plugin/ceresdb.png" alt="CeresDB" height="50" />
</a>
</p>

View File

@@ -0,0 +1,44 @@
---
title: CeresDB
status: new
---
<img src="/assets/plugin/ceresdb.png" class="connector-logo" />
#### What is CeresDB?
CeresDB is a high-performance, distributed, cloud native time-series database.
#### Environment
!!! note
To use this data source, you must upgrade the DataCap service to >= `1.9.x`.
Support Time: `2023-04-12`
#### Configure
---
!!! note
If your CeresDB service version requires any other special configuration, modify the configuration file accordingly and restart the DataCap service.
=== "Configure"
| Field | Required | Default Value |
|:------:|:---------------------------------:|:-------------:|
| `Name` | :material-check-circle: { .red } | - |
| `Host` | :material-check-circle: { .red } | `127.0.0.1` |
| `Port` | :material-check-circle: { .red } | `5440` |
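
For a quick connectivity check against the configured endpoint: the plugin ultimately issues an HTTP `POST` with a JSON `query` body to the `sql` path (see `CeresDBAdapter` in this change). Below is a minimal, JDK-only sketch — not part of this commit — assuming the default `127.0.0.1:5440` from the table above; adjust the URL and SQL for your environment.

```java
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class CeresDBSmokeTest
{
    public static void main(String[] args)
            throws Exception
    {
        // Endpoint and payload mirror what CeresDBAdapter builds: POST {"query": "..."} to the "sql" path
        URL url = new URL("http://127.0.0.1:5440/sql");
        HttpURLConnection connection = (HttpURLConnection) url.openConnection();
        connection.setRequestMethod("POST");
        connection.setRequestProperty("Content-Type", "application/json; charset=utf-8");
        connection.setDoOutput(true);
        byte[] body = "{\"query\": \"SELECT 1\"}".getBytes(StandardCharsets.UTF_8);
        try (OutputStream out = connection.getOutputStream()) {
            out.write(body);
        }
        try (Scanner scanner = new Scanner(connection.getInputStream(), StandardCharsets.UTF_8.name())) {
            scanner.useDelimiter("\\A");
            System.out.println(connection.getResponseCode() + " " + (scanner.hasNext() ? scanner.next() : ""));
        }
    }
}
```

If the endpoint is reachable, the response body should resemble the `code`/`rows` structure modelled by `CeresDBResponse` later in this change.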
#### Version (Validation)
---
!!! warning
The online service has not been tested yet. If you have detailed test results, please submit them via [issues](https://github.com/EdurtIO/datacap/issues/new/choose).
- [x] 1.x

View File

@@ -0,0 +1,44 @@
---
title: CeresDB
status: new
---
<img src="/assets/plugin/ceresdb.png" class="connector-logo" />
#### What is CeresDB?
CeresDB is a high-performance, distributed, cloud-native time-series database.
#### Environment
!!! note
To use this data source, you must upgrade the DataCap service to >= `1.9.x`.
Support time: `2023-04-12`
#### Configure
---
!!! note
If your CeresDB service version requires any other special configuration, modify the configuration file accordingly and restart the DataCap service.
=== "Configure"
| Field | Required | Default Value |
|:------:|:---------------------------------:|:-------------:|
| `Name` | :material-check-circle: { .red } | - |
| `Host` | :material-check-circle: { .red } | `127.0.0.1` |
| `Port` | :material-check-circle: { .red } | `5440` |
#### Version (Validation)
---
!!! warning
This service version has not been tested yet. If you run detailed tests and find errors, please submit [issues](https://github.com/EdurtIO/datacap/issues/new/choose).
- [x] 1.x

View File

@@ -1,6 +1,5 @@
---
title: H2 Database
status: new
---
<img src="/assets/plugin/h2.png" class="connector-logo" />

View File

@@ -1,6 +1,5 @@
---
title: DataCap in a Docker container
status: new
---
The DataCap project provides the [qianmoq/datacap](https://hub.docker.com/r/qianmoq/datacap) Docker image that includes the DataCap server and a default configuration. The Docker image is published to Docker Hub and can be used with the Docker runtime, among several others.

View File

@@ -165,6 +165,7 @@ nav:
- Zookeeper: reference/connectors/native/zookeeper.md
- Redis: reference/connectors/native/redis.md
- Http:
- CeresDB: reference/connectors/http/ceresdb.md
- ClickHouse: reference/connectors/http/clickhouse.md
- Developer guide:
- Development environment: developer_guide/env.md

View File

@@ -0,0 +1,75 @@
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>io.edurt.datacap</groupId>
<artifactId>datacap</artifactId>
<version>1.9.0-SNAPSHOT</version>
<relativePath>../../pom.xml</relativePath>
</parent>
<artifactId>datacap-http-ceresdb</artifactId>
<description>DataCap - CeresDB</description>
<properties>
<plugin.name>http-ceresdb</plugin.name>
</properties>
<dependencies>
<dependency>
<groupId>io.edurt.datacap</groupId>
<artifactId>datacap-spi</artifactId>
<scope>provided</scope>
</dependency>
<dependency>
<groupId>io.edurt.datacap</groupId>
<artifactId>datacap-common</artifactId>
</dependency>
<dependency>
<groupId>commons-beanutils</groupId>
<artifactId>commons-beanutils</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-simple</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers</artifactId>
<scope>test</scope>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>${assembly-plugin.version}</version>
<configuration>
<finalName>${plugin.name}</finalName>
<descriptors>
<descriptor>../../configure/assembly/assembly-plugin.xml</descriptor>
</descriptors>
<outputDirectory>../../dist/plugins/${plugin.name}</outputDirectory>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>
</project>

View File

@@ -0,0 +1,92 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import io.edurt.datacap.spi.adapter.HttpAdapter;
import io.edurt.datacap.spi.connection.HttpConfigure;
import io.edurt.datacap.spi.connection.HttpConnection;
import io.edurt.datacap.spi.connection.http.HttpClient;
import io.edurt.datacap.spi.connection.http.HttpMethod;
import io.edurt.datacap.spi.json.JSON;
import io.edurt.datacap.spi.model.Response;
import io.edurt.datacap.spi.model.Time;
import lombok.extern.slf4j.Slf4j;
import okhttp3.MediaType;
import org.apache.commons.beanutils.BeanUtils;
import org.apache.commons.lang3.ObjectUtils;
import java.util.ArrayList;
import java.util.Date;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.stream.Collectors;
@Slf4j
public class CeresDBAdapter
extends HttpAdapter
{
public CeresDBAdapter(HttpConnection connection)
{
super(connection);
}
@Override
public Response handlerExecute(String content)
{
Time processorTime = new Time();
processorTime.setStart(new Date().getTime());
Response response = this.httpConnection.getResponse();
HttpConfigure configure = new HttpConfigure();
if (response.getIsConnected()) {
List<String> headers = new ArrayList<>();
List<String> types = new ArrayList<>();
List<Object> columns = new ArrayList<>();
try {
BeanUtils.copyProperties(configure, this.httpConnection.getConfigure());
configure.setAutoConnected(Boolean.FALSE);
configure.setRetry(0);
configure.setMethod(HttpMethod.POST);
configure.setPath("sql");
Map<String, String> map = new ConcurrentHashMap<>();
map.put("query", content);
configure.setJsonBody(JSON.toJSON(map));
configure.setMediaType(MediaType.parse("application/json; charset=utf-8"));
HttpConnection httpConnection = new HttpConnection(configure, new Response());
HttpClient httpClient = HttpClient.getInstance(configure, httpConnection);
String body = httpClient.execute();
CeresDBResponse ceresDBResponse = JSON.objectmapper.readValue(body, CeresDBResponse.class);
if (ceresDBResponse.getCode() == 0 || ObjectUtils.isNotEmpty(ceresDBResponse.getRows())) {
response.setIsSuccessful(true);
for (int i = ceresDBResponse.getRows().size() - 1; i >= 0; i--) {
Map<String, String> row = ceresDBResponse.getRows().get(i);
if (i == 0) {
headers.addAll(row.keySet());
// TODO: check type
}
List<Object> _columns = row.entrySet()
.stream()
.map(entry -> entry.getValue())
.collect(Collectors.toList());
columns.add(handlerFormatter(configure.getFormat(), headers, _columns));
}
}
else {
response.setIsSuccessful(Boolean.FALSE);
response.setMessage(ceresDBResponse.getMessage());
}
}
catch (Exception ex) {
log.error("Execute content failed content {} exception ", content, ex);
response.setIsSuccessful(Boolean.FALSE);
response.setMessage(ex.getMessage());
}
finally {
response.setHeaders(headers);
response.setTypes(types);
response.setColumns(columns);
}
}
processorTime.setEnd(new Date().getTime());
response.setProcessor(processorTime);
return response;
}
}
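
For reference, the loop above flattens each JSON row object into a list of values, taking the column names from a row's keys. A minimal, self-contained sketch of the same idea follows — illustrative only, not part of this commit; the payload string is a hypothetical example of the `code`/`rows` shape that the `CeresDBResponse` POJO added below models.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import io.edurt.datacap.plugin.natived.ceresdb.CeresDBResponse;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class CeresDBRowFlattenSketch
{
    public static void main(String[] args)
            throws Exception
    {
        // Hypothetical response body following the code/rows structure of CeresDBResponse
        String body = "{\"code\":0,\"rows\":[{\"tsid\":\"1\",\"value\":\"0.5\"},{\"tsid\":\"2\",\"value\":\"0.7\"}]}";
        CeresDBResponse parsed = new ObjectMapper().readValue(body, CeresDBResponse.class);

        // Column names come from a row's keys; each row becomes one list of values
        List<String> headers = new ArrayList<>(parsed.getRows().get(0).keySet());
        List<List<Object>> columns = new ArrayList<>();
        for (Map<String, String> row : parsed.getRows()) {
            columns.add(new ArrayList<>(row.values()));
        }
        System.out.println(headers + " -> " + columns);
    }
}
```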

View File

@@ -0,0 +1,81 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import io.edurt.datacap.spi.Plugin;
import io.edurt.datacap.spi.PluginType;
import io.edurt.datacap.spi.connection.HttpConfigure;
import io.edurt.datacap.spi.connection.HttpConnection;
import io.edurt.datacap.spi.model.Configure;
import io.edurt.datacap.spi.model.Response;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.beanutils.BeanUtils;
import org.apache.commons.lang3.ObjectUtils;
@Slf4j
public class CeresDBPlugin
implements Plugin
{
private HttpConfigure configure;
private HttpConnection connection;
private Response response;
@Override
public String validator()
{
return "SELECT 1";
}
@Override
public String name()
{
return "CeresDB";
}
@Override
public String description()
{
return "Integrate CeresDB data sources";
}
@Override
public PluginType type()
{
return PluginType.HTTP;
}
@Override
public void connect(Configure configure)
{
try {
this.response = new Response();
this.configure = new HttpConfigure();
BeanUtils.copyProperties(this.configure, configure);
this.connection = new HttpConnection(this.configure, this.response);
}
catch (Exception ex) {
this.response.setIsConnected(Boolean.FALSE);
this.response.setMessage(ex.getMessage());
}
}
@Override
public Response execute(String content)
{
if (ObjectUtils.isNotEmpty(this.connection)) {
log.info("Execute ceresdb plugin logic started");
this.response = this.connection.getResponse();
CeresDBAdapter processor = new CeresDBAdapter(this.connection);
this.response = processor.handlerExecute(content);
log.info("Execute ceresdb plugin logic end");
}
this.destroy();
return this.response;
}
@Override
public void destroy()
{
if (ObjectUtils.isNotEmpty(this.connection)) {
this.connection.destroy();
}
}
}
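
The plugin follows the connect / execute / destroy lifecycle of the `Plugin` SPI, and `execute` tears the connection down when it finishes. A minimal direct-usage sketch — illustrative only, not part of this commit; host, port, and database are placeholder values mirroring the test further below, and in DataCap itself the plugin is discovered through `CeresDBPluginModule` rather than instantiated by hand:

```java
import io.edurt.datacap.plugin.natived.ceresdb.CeresDBPlugin;
import io.edurt.datacap.spi.model.Configure;
import io.edurt.datacap.spi.model.Response;

import java.util.Optional;

public class CeresDBPluginUsageSketch
{
    public static void main(String[] args)
    {
        Configure configure = new Configure();
        configure.setHost("127.0.0.1");                // placeholder host
        configure.setPort(5440);                        // default HTTP port from the connector configuration
        configure.setDatabase(Optional.of("default"));  // placeholder database name

        CeresDBPlugin plugin = new CeresDBPlugin();
        plugin.connect(configure);
        // validator() returns "SELECT 1", a cheap statement for checking the connection
        Response response = plugin.execute(plugin.validator());
        System.out.println(response.getIsSuccessful() + " " + response.getMessage());
        // execute() calls destroy() itself, so no further cleanup is needed here
    }
}
```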

View File

@@ -0,0 +1,38 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import com.google.inject.multibindings.Multibinder;
import io.edurt.datacap.spi.AbstractPluginModule;
import io.edurt.datacap.spi.Plugin;
import io.edurt.datacap.spi.PluginModule;
import io.edurt.datacap.spi.PluginType;
public class CeresDBPluginModule
extends AbstractPluginModule
implements PluginModule
{
@Override
public String getName()
{
return "CeresDB";
}
@Override
public PluginType getType()
{
return PluginType.HTTP;
}
@Override
public AbstractPluginModule get()
{
return this;
}
protected void configure()
{
Multibinder<String> module = Multibinder.newSetBinder(this.binder(), String.class);
module.addBinding().toInstance(this.getClass().getSimpleName());
Multibinder<Plugin> plugin = Multibinder.newSetBinder(this.binder(), Plugin.class);
plugin.addBinding().to(CeresDBPlugin.class);
}
}

View File

@@ -0,0 +1,22 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import com.fasterxml.jackson.annotation.JsonIgnoreProperties;
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.ToString;
import java.util.List;
import java.util.Map;
@Data
@ToString
@NoArgsConstructor
@AllArgsConstructor
@JsonIgnoreProperties(ignoreUnknown = true)
public class CeresDBResponse
{
private int code;
private String message;
private List<Map<String, String>> rows;
}

View File

@@ -0,0 +1 @@
io.edurt.datacap.plugin.natived.ceresdb.CeresDBPluginModule

View File

@@ -0,0 +1,56 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.utility.DockerImageName;
import java.net.Inet4Address;
import java.net.InetAddress;
import java.net.NetworkInterface;
import java.net.SocketException;
import java.util.Enumeration;
public class CeresDBContainer
extends GenericContainer<CeresDBContainer>
{
private static final DockerImageName DEFAULT_IMAGE_NAME =
DockerImageName.parse("ceresdb/ceresdb-server");
public static final int PORT = 8831;
public static final int HTTP_PORT = 5440;
public CeresDBContainer()
{
super(DEFAULT_IMAGE_NAME);
withExposedPorts(PORT, HTTP_PORT);
}
public CeresDBContainer(final DockerImageName dockerImageName)
{
super(dockerImageName);
dockerImageName.assertCompatibleWith(DEFAULT_IMAGE_NAME);
withExposedPorts(PORT, HTTP_PORT);
}
public String getLinuxLocalIp()
{
String ip = "";
try {
Enumeration<NetworkInterface> networkInterfaces =
NetworkInterface.getNetworkInterfaces();
while (networkInterfaces.hasMoreElements()) {
NetworkInterface networkInterface = networkInterfaces.nextElement();
Enumeration<InetAddress> inetAddresses = networkInterface.getInetAddresses();
while (inetAddresses.hasMoreElements()) {
InetAddress inetAddress = inetAddresses.nextElement();
if (!inetAddress.isLoopbackAddress() && inetAddress instanceof Inet4Address) {
ip = inetAddress.getHostAddress();
}
}
}
}
catch (SocketException ex) {
ex.printStackTrace();
}
return ip;
}
}

View File

@@ -0,0 +1,30 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Key;
import com.google.inject.TypeLiteral;
import io.edurt.datacap.spi.Plugin;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import java.util.Set;
public class CeresDBPluginModuleTest
{
private Injector injector;
@Before
public void before()
{
this.injector = Guice.createInjector(new CeresDBPluginModule());
}
@Test
public void test()
{
Set<Plugin> plugins = injector.getInstance(Key.get(new TypeLiteral<Set<Plugin>>() {}));
Assert.assertTrue(plugins.size() > 0);
}
}

View File

@@ -0,0 +1,67 @@
package io.edurt.datacap.plugin.natived.ceresdb;
import com.google.inject.Guice;
import com.google.inject.Injector;
import com.google.inject.Key;
import com.google.inject.TypeLiteral;
import io.edurt.datacap.spi.Plugin;
import io.edurt.datacap.spi.model.Configure;
import io.edurt.datacap.spi.model.Response;
import lombok.extern.slf4j.Slf4j;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;
import org.testcontainers.containers.Network;
import org.testcontainers.lifecycle.Startables;
import java.util.Optional;
import java.util.Set;
import java.util.stream.Stream;
@Slf4j
public class CeresDBPluginTest
{
private static final String HOST = "ceresDBCluster";
private Network network;
private CeresDBContainer container;
private Injector injector;
private Configure configure;
@Before
public void before()
{
network = Network.newNetwork();
container = new CeresDBContainer().withNetwork(network).withNetworkAliases(HOST).withExposedPorts(CeresDBContainer.HTTP_PORT);
Startables.deepStart(Stream.of(container)).join();
log.info("CeresDB container started");
injector = Guice.createInjector(new CeresDBPluginModule());
configure = new Configure();
configure.setHost("localhost");
configure.setPort(container.getMappedPort(CeresDBContainer.HTTP_PORT));
configure.setDatabase(Optional.of("default"));
}
@Test
public void test()
{
Set<Plugin> plugins = injector.getInstance(Key.get(new TypeLiteral<Set<Plugin>>() {}));
Optional<Plugin> pluginOptional = plugins.stream().filter(v -> v.name().equalsIgnoreCase("CeresDB")).findFirst();
if (pluginOptional.isPresent()) {
Plugin plugin = pluginOptional.get();
plugin.connect(configure);
String sql = "SELECT * FROM system.public.tables";
Response response = plugin.execute(sql);
log.info("================ plugin executed information =================");
if (!response.getIsSuccessful()) {
log.error("Message: {}", response.getMessage());
}
else {
response.getColumns().forEach(column -> log.info(column.toString()));
}
Assert.assertTrue(response.getIsSuccessful());
plugin.destroy();
}
}
}

View File

@@ -21,8 +21,10 @@
<module>plugin/datacap-native-zookeeper</module>
<module>plugin/datacap-native-redis</module>
<module>plugin/datacap-native-kafka</module>
<module>plugin/datacap-native-h2</module>
<module>plugin/datacap-http-cratedb</module>
<module>plugin/datacap-http-clickhouse</module>
<module>plugin/datacap-http-ceresdb</module>
<module>plugin/datacap-jdbc-clickhouse</module>
<module>plugin/datacap-jdbc-cratedb</module>
<module>plugin/datacap-jdbc-db2</module>
@@ -55,7 +57,6 @@
<module>plugin/datacap-jdbc-trino</module>
<module>executor/datacap-executor-example</module>
<module>executor/datacap-executor-seatunnel</module>
<module>plugin/datacap-native-h2</module>
</modules>
<name>DataCap</name>
@@ -115,6 +116,7 @@
<guava.version>31.1-jre</guava.version>
<commons-beanutils.version>1.9.4</commons-beanutils.version>
<redis.version>3.8.0</redis.version>
<testcontainers.version>1.17.6</testcontainers.version>
<assembly-plugin.version>3.5.0</assembly-plugin.version>
<plugin.maven.checkstyle.version>3.0.0</plugin.maven.checkstyle.version>
<plugin.maven.findbugs.version>3.0.5</plugin.maven.findbugs.version>
@@ -214,6 +216,11 @@
<artifactId>sql-formatter</artifactId>
<version>${sql-formatter.version}</version>
</dependency>
<dependency>
<groupId>org.testcontainers</groupId>
<artifactId>testcontainers</artifactId>
<version>${testcontainers.version}</version>
</dependency>
<dependency>
<groupId>com.google.guava</groupId>
<artifactId>guava</artifactId>