Dolos
Dolos is a new type of Cardano node, fine-tuned to solve a very narrow scope: keeping an updated copy of the ledger and replying to queries from trusted clients, while requiring a small fraction of the resources needed by a full node.
Motivation
Cardano nodes can assume one of two roles:
- block producer: in charge of minting blocks
- relay node: in charge of relaying blocks from / to peers.
Each of these roles has concrete responsibilities and runtime requirements. Criteria such as network topology, resource allocation, backup procedures, etc., vary by role.
We argue that there's a third role that should be treated independently, with the goal of optimizing its workload: nodes used with the sole purpose of resolving local state queries or serving as a data source for downstream tools that require ledger data.
There are many potential optimizations for nodes performing this type of workload that are not currently possible with the Cardano node:
- drastically limiting the amount of memory required to execute the node
- switching to storage solutions with different trade-offs (e.g., S3, NFS)
- providing alternative wire protocols that are friendlier for data queries (e.g., REST, gRPC)
- providing an auth layer in front of the API endpoints
The goal of this project is to provide a very limited and focused version of the Cardano node that DevOps teams can use as a cost-effective, performant option to deploy data nodes side-by-side with producer / relay nodes.
This new role would be useful in the following scenarios:
- As a data source for well-known tools such as DB-sync, Ogmios, CARP, Oura, etc.
- As a fast, low-resource node for syncing other producer / relay nodes.
- As a ledger data source that scales dynamically according to query load.
- As a node that leverages network / cloud storage technology instead of mounted drives.
- As a node that scales horizontally, allowing high-availability topologies.
- As a low-resource local node for resolving local state queries.
Detailed design
Data nodes will share some features with the mainstream Cardano node:
- Node-to-Node and Node-to-Client Chain-Sync mini-protocol
- Node-to-Node Block-Fetch mini-protocol
- Node-to-Client Local-State-Query mini-protocol
This new type of node will also provide features not currently available in the mainstream Cardano node:
- HTTP/JSON endpoint for common local state queries
- gRPC endpoint for local state queries and chain-sync procedure
- Different storage options including NFS, S3 & GCP Buckets
- Low memory consumption (allowed by the trade-offs in scope)
Drawbacks
- Although the scope is very narrow compared to a real, full-blown node, this tool has a large level of effort (LoE).
- There's overlap with some TxPipe tools such as Oura and Scrolls. The mitigation plan is to hoist individual components into Pallas to avoid duplication (DRY).
- Some components, such as the gRPC interface, might be useful even in environments running the full-blown Cardano node. To mitigate this, we will architect the system in such a way that different entry points (aka binaries) can perform different roles. The gRPC bridge would be one of these.
Alternatives
- Use the full-blown Cardano node even for the scenarios described in this RFC.
- Split the project into sub-components that can be orchestrated to achieve the same result.
Unresolved questions
- Performance gains and resource allocation optimizations are theoretical; they were extrapolated from our experience implementing Cardano data processing pipelines using components written in Rust. We won't have a strict, quantifiable measurement until we develop a PoC of this project. To mitigate this issue, our development process will include performance benchmark execution at each development milestone. Reports will be included as part of each release.
- There's documentation lacking regarding the local state query wire format, which will require some reverse engineering of the mainstream Cardano node. We have experience with this approach, but the level of effort associated with the task is hard to anticipate. To mitigate this issue, we'll reach out to IOG for advice and documentation in case it's available.
Installation
Depending on your needs, Dolos provides different installation options:
- Binary Release: use one of our pre-compiled binary releases for the supported platforms.
- From Source: compile a binary from source code using Rust's toolchain.
- Docker: run the tool from a pre-built Docker image.
- Kubernetes: deploy Dolos as a resource within a Kubernetes cluster.
Binary Releases
Dolos can be run as a standalone executable. The GitHub release page includes binaries for different operating systems and architectures. It's a self-contained, single-file binary that can be downloaded directly.
For simplicity, we also provide an install script that automates the installation process. The script only works on macOS and Linux, for either the x86_64 or arm64 architecture. You'll need curl and sudo access. Execute the install by running the following commands from your terminal:
```shell
curl -fsSL https://github.com/txpipe/dolos/raw/main/.github/public/install.sh -o get-dolos.sh
sudo sh ./get-dolos.sh
```
From Source
The following instructions show how to build and install Dolos from source code.
Pre-requisites
- Rust toolchain
Procedure
```shell
git clone git@github.com:txpipe/dolos.git
cd dolos
cargo install --all-features --path .
```
Docker
Dolos provides pre-built public Docker images through GitHub Packages. To execute Dolos via Docker, use the following command:

```shell
docker run ghcr.io/txpipe/dolos:latest
```
The result of the above command should show Dolos' command-line help message.
Entry Point
The entry point of the image points to the Dolos executable. You can pass the same command-line arguments that you would pass to the binary release running on bare metal. For example:

```shell
docker run -it ghcr.io/txpipe/dolos:latest \
  --config my-custom-config.toml
```
For more information on available command-line arguments, check the usage section.
Using a Configuration File
The default daemon configuration file for Dolos is located at `/etc/dolos/daemon.toml`. To run Dolos in daemon mode with a custom configuration file, you need to mount it at that location. The following example runs a Docker container in the background using a configuration file named `daemon.toml` located in the current folder:
```shell
docker run -d -v $(pwd)/daemon.toml:/etc/dolos/daemon.toml \
  ghcr.io/txpipe/dolos:latest daemon
```
Versioned Images
Images are also tagged with the corresponding version number. We highly recommend using a fixed image version in production environments to avoid the effects of new features being included in each release (please remember Dolos hasn't reached v1 stability guarantees).
To use a versioned image, replace the `latest` tag with the desired version, using the `v` prefix. For example, to use version `0.6.0`, use the following image:

```
ghcr.io/txpipe/dolos:v0.6.0
```
Multiple Architectures
The Dolos Docker image is multi-arch, meaning it can be used on different CPU architectures. We currently support `amd64` (aka `x86_64`) and `arm64`.
The Docker daemon will detect your architecture and use the correct manifest to run the image. The usage procedures are the same regardless of the architecture.
Kubernetes
Dolos can be deployed as a standalone Kubernetes `StatefulSet` resource.
Please note that the number of replicas is set to `1`. Dolos doesn't have any kind of coordination between instances; adding more than one replica will just create extra pipelines duplicating the same work.
```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: dolos
data:
  daemon.toml: |-
    [upstream]
    # REDACTED: here goes your `upstream` configuration options

    [rolldb]
    # REDACTED: here goes your `rolldb` configuration options
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: dolos
spec:
  replicas: 1
  serviceName: dolos
  selector:
    matchLabels:
      app: dolos
  template:
    metadata:
      labels:
        app: dolos
    spec:
      containers:
        - name: main
          image: ghcr.io/txpipe/dolos:latest
          # run Dolos in daemon mode (sync + client endpoints)
          args: ["daemon"]
          # mounts for Dolos' local DB and for the configuration file
          volumeMounts:
            - mountPath: /var/dolos/db
              name: db
            - mountPath: /etc/dolos
              name: config
          resources:
            requests:
              memory: 1Gi
              cpu: 1
            limits:
              memory: 1Gi
      volumes:
        # an empty-dir to store your data. In a real scenario, this should be a PVC
        - name: db
          emptyDir: {}
        # a config map resource with Dolos' config, particular for your requirements
        - name: config
          configMap:
            name: dolos
```
Configuration
This section provides information on how to configure Dolos. Configuration is provided through a TOML file, organized into the sections described below.
Upstream
The `upstream` section defines the peer that Dolos pulls chain data from, along with the network magic identifying the network:

```toml
[upstream]
peer_address = "preview-node.world.dev.cardano.org:30002"
network_magic = 2
```
RollDB
The `rolldb` section configures the local database where chain data is stored. The `path` value points to the storage directory, while `k_param` defines how many recent blocks are kept available for potential rollbacks (roughly corresponding to the chain's security parameter k):

```toml
[rolldb]
path = "./data"
k_param = 1000
```
Serve
The `serve` section configures the endpoints exposed to clients. The example below makes the gRPC endpoint listen on port 50051 on all interfaces:

```toml
[serve.grpc]
listen_address = "[::]:50051"
```
Byron
The `byron` section points to the Byron genesis file used to bootstrap the chain:

```toml
[byron]
path = "./byron.json"
```
Logging
The `logging` section controls the verbosity of the log output:

```toml
[logging]
max_level = "debug"
```
Example Configuration
Preview Network

```toml
[upstream]
peer_address = "preview-node.world.dev.cardano.org:30002"
network_magic = 2

[rolldb]
path = "./data"
k_param = 1000

[serve.grpc]
listen_address = "[::]:50051"

[byron]
path = "./byron.json"

[logging]
max_level = "debug"
```
Usage
This section provides information on how to use Dolos.
There are two ways to start the server that provides access to the endpoints: the `serve` command and the `daemon` command.
`serve` mode
The `serve` command starts Dolos with the sole purpose of exposing the client endpoints; no other functions are executed.
To start Dolos in `serve` mode, run the following command from the terminal:

```shell
dolos serve
```
`daemon` mode
The `daemon` command starts Dolos with all of its main functions enabled (syncing from upstream, updating the ledger, etc.), which includes the client endpoint server.
To start Dolos in `daemon` mode, run the following command from the terminal:

```shell
dolos daemon
```
`eval` command
The `eval` command is a utility for evaluating transactions. It takes tx data from an external source and uses the current ledger state to evaluate phase-1 validation rules.
Usage
To execute the evaluation, run the following command from your terminal:

```shell
dolos eval --file <FILE> --era <ERA> --magic <MAGIC> --slot <SLOT>
```
The args should be interpreted as:
- `--file <FILE>`: the path to the file containing the tx data as hex-encoded CBOR.
- `--era <ERA>`: the id of the era that should be used to interpret the transaction data.
- `--magic <MAGIC>`: the protocol magic of the network.
- `--slot <SLOT>`: the slot that should be used for retrieving protocol parameters.
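Since `--file` expects the transaction as hex-encoded CBOR, a raw CBOR transaction must be hex-encoded before being passed to the command. A minimal Python sketch (the byte string and file name below are hypothetical stand-ins, not a real transaction):

```python
# Hypothetical stand-in for raw CBOR transaction bytes (e.g. as produced
# by a transaction builder).
raw_tx = bytes.fromhex("84a300818258200000000000000000")

# `dolos eval --file` expects the hex-encoded text form, so write the
# hex representation rather than the raw bytes.
with open("tx.hex", "w") as f:
    f.write(raw_tx.hex())
```

The resulting `tx.hex` file is what gets passed via the `--file` argument.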
API
gRPC
Introduction
- Dolos exposes a gRPC endpoint allowing clients to query data.
- The endpoint adheres to the UTxO RPC interface definition found at https://utxorpc.org.
- The currently implemented module is the `Sync` module, which allows clients to sync with the state of the chain stored by Dolos.
- The Dolos endpoint also supports gRPC-web, a variant of gRPC that can be used directly from browsers.
Connecting to the server
- Once started, the server exposes a TCP port.
- The default port number is `50051`, but it can be changed via configuration.
- This port accepts HTTP/2 connections following the standard gRPC mechanism.
- The port also accepts HTTP/1.1 connections following the gRPC-web protocol.
- Developers can make use of UTxO RPC SDK libraries to interact with the endpoint programmatically.
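Since the endpoint is exposed as a plain TCP port, its reachability can be verified before wiring up an SDK. A minimal sketch using Python's standard library (the host and port are deployment-specific assumptions):

```python
import socket

def grpc_port_open(host: str, port: int = 50051, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to the gRPC port can be established."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: probe a locally running Dolos instance on the default port.
print(grpc_port_open("localhost"))
```

A successful TCP connection only proves the port is listening; actual queries still go through a gRPC or gRPC-web client.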
Authentication Mechanism
- Dolos has a built-in mechanism for authenticating clients using TLS.
- By specifying a CA authority, Dolos can allow only clients that provide a matching certificate.
- The CA authority is specified by pointing to the corresponding `.pem` file through configuration.
Benchmarks
Long-running operation - June test
A "long-running" test refers to a type of test that is designed to evaluate the system's behavior under sustained use over a long period of time. These tests can be used to identify problems that might not be apparent in short-term testing scenarios.
Among other things, long-running tests are often used to identify memory leaks, resource leaks, or degradation in system performance over time. If a system runs perfectly for an hour but starts having issues after several hours or days, a long-running test would help identify such problems.
In particular, we want to measure the CPU and memory usage of Dolos compared to the Haskell node. The main goal of Dolos is to provide a lightweight alternative (with substantially reduced features) for data access use cases. To perform the comparison, both implementations were hosted under equivalent conditions:
- same hardware
- both fully-synced to the Preview network
- similar client request profile (1 chain-sync consumer)
CPU Usage
In this analysis we compare CPU usage, expressed as shares of a vCPU (~1 core); 1 share represents 1/1000 of a vCPU. Each bucket represents the average number of shares utilized by each process in a 30-minute period. The information was gathered during continuous operation throughout a 2-day period.
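To put the units in perspective, the shares in the table below can be converted to vCPU fractions. A quick sketch using the first row of the table (2023-06-19 08:30:00):

```python
# 1 share == 1/1000 of a vCPU (i.e. a millicore).
dolos_shares = 2.11     # Dolos, first row of the table
haskell_shares = 222    # Haskell node, same row

dolos_vcpu = dolos_shares / 1000      # fraction of a core used by Dolos
haskell_vcpu = haskell_shares / 1000  # fraction of a core used by the Haskell node
ratio = haskell_shares / dolos_shares # relative CPU usage of the Haskell node

print(f"{dolos_vcpu:.5f} vCPU vs {haskell_vcpu:.3f} vCPU (~{ratio:.0f}x)")
```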
Time | Dolos (shares) | Haskell (shares) |
---|---|---|
2023-06-19 08:30:00 | 2.11 | 222 |
2023-06-19 09:00:00 | 1.87 | 222 |
2023-06-19 09:30:00 | 2.00 | 231 |
2023-06-19 10:00:00 | 2.01 | 243 |
2023-06-19 10:30:00 | 1.98 | 240 |
2023-06-19 11:00:00 | 2.48 | 241 |
2023-06-19 11:30:00 | 2.16 | 226 |
2023-06-19 12:00:00 | 2.23 | 240 |
2023-06-19 12:30:00 | 2.12 | 241 |
2023-06-19 13:00:00 | 2.42 | 236 |
2023-06-19 13:30:00 | 2.11 | 232 |
2023-06-19 14:00:00 | 2.00 | 231 |
2023-06-19 14:30:00 | 2.06 | 278 |
2023-06-19 15:00:00 | 2.38 | 248 |
2023-06-19 15:30:00 | 1.97 | 224 |
2023-06-19 16:00:00 | 2.00 | 161 |
2023-06-19 16:30:00 | 2.25 | 236 |
2023-06-19 17:00:00 | 1.90 | 234 |
2023-06-19 17:30:00 | 1.98 | 236 |
2023-06-19 18:00:00 | 1.92 | 227 |
2023-06-19 18:30:00 | 2.18 | 232 |
2023-06-19 19:00:00 | 2.06 | 231 |
2023-06-19 19:30:00 | 2.25 | 242 |
2023-06-19 20:00:00 | 2.09 | 239 |
2023-06-19 20:30:00 | 2.25 | 250 |
2023-06-19 21:00:00 | 2.05 | 230 |
2023-06-19 21:30:00 | 1.96 | 229 |
2023-06-19 22:00:00 | 2.06 | 226 |
2023-06-19 22:30:00 | 2.22 | 237 |
2023-06-19 23:00:00 | 2.34 | 234 |
2023-06-19 23:30:00 | 2.36 | 238 |
2023-06-20 00:00:00 | 2.16 | 238 |
2023-06-20 00:30:00 | 2.46 | 236 |
2023-06-20 01:00:00 | 2.26 | 241 |
2023-06-20 01:30:00 | 2.21 | 242 |
2023-06-20 02:00:00 | 1.89 | 226 |
2023-06-20 02:30:00 | 2.53 | 252 |
2023-06-20 03:00:00 | 1.93 | 226 |
2023-06-20 03:30:00 | 2.06 | 232 |
2023-06-20 04:00:00 | 2.37 | 240 |
2023-06-20 04:30:00 | 2.65 | 250 |
2023-06-20 05:00:00 | 2.12 | 233 |
2023-06-20 05:30:00 | 2.41 | 329 |
2023-06-20 06:00:00 | 2.36 | 231 |
2023-06-20 06:30:00 | 2.80 | 273 |
2023-06-20 07:00:00 | 2.22 | 226 |
2023-06-20 07:30:00 | 2.04 | 217 |
2023-06-20 08:00:00 | 2.50 | 238 |
2023-06-20 08:30:00 | 2.24 | 228 |
2023-06-20 09:00:00 | 2.51 | 230 |
2023-06-20 09:30:00 | 2.31 | 235 |
2023-06-20 10:00:00 | 2.15 | 225 |
2023-06-20 10:30:00 | 2.36 | 231 |
2023-06-20 11:00:00 | 2.91 | 232 |
2023-06-20 11:30:00 | 2.83 | 239 |
2023-06-20 12:00:00 | 2.64 | 238 |
2023-06-20 12:30:00 | 2.76 | 231 |
2023-06-20 13:00:00 | 2.88 | 256 |
2023-06-20 13:30:00 | 2.79 | 233 |
2023-06-20 14:00:00 | 2.35 | 222 |
2023-06-20 14:30:00 | 2.12 | 223 |
2023-06-20 15:00:00 | 2.69 | 230 |
2023-06-20 15:30:00 | 2.30 | 228 |
2023-06-20 16:00:00 | 2.37 | 232 |
2023-06-20 16:30:00 | 2.45 | 241 |
2023-06-20 17:00:00 | 2.63 | 230 |
2023-06-20 17:30:00 | 3.08 | 239 |
2023-06-20 18:00:00 | 2.72 | 225 |
2023-06-20 18:30:00 | 2.40 | 222 |
2023-06-20 19:00:00 | 2.61 | 229 |
2023-06-20 19:30:00 | 2.60 | 231 |
2023-06-20 20:00:00 | 2.43 | 235 |
2023-06-20 20:30:00 | 2.62 | 229 |
2023-06-20 21:00:00 | 2.19 | 220 |
2023-06-20 21:30:00 | 2.62 | 232 |
2023-06-20 22:00:00 | 2.76 | 232 |
2023-06-20 22:30:00 | 2.01 | 244 |
2023-06-20 23:00:00 | 2.30 | 231 |
2023-06-20 23:30:00 | 2.22 | 252 |
2023-06-21 00:00:00 | 2.79 | 234 |
2023-06-21 00:30:00 | 2.33 | 233 |
2023-06-21 01:00:00 | 2.31 | 229 |
2023-06-21 01:30:00 | 2.56 | 222 |
2023-06-21 02:00:00 | 2.71 | 237 |
2023-06-21 02:30:00 | 2.51 | 248 |
2023-06-21 03:00:00 | 2.54 | 236 |
2023-06-21 03:30:00 | 2.25 | 233 |
2023-06-21 04:00:00 | 2.89 | 242 |
2023-06-21 04:30:00 | 2.35 | 229 |
2023-06-21 05:00:00 | 2.65 | 236 |
2023-06-21 05:30:00 | 2.18 | 222 |
2023-06-21 06:00:00 | 2.64 | 227 |
2023-06-21 06:30:00 | 3.03 | 239 |
2023-06-21 07:00:00 | 2.26 | 229 |
2023-06-21 07:30:00 | 3.81 | 268 |
2023-06-21 08:00:00 | 2.90 | 262 |
2023-06-21 08:30:00 | 2.85 | 247 |
Memory Usage
In this analysis we compare memory usage, expressed as a total amount of data (KB, MB, GB). Each bucket represents the average memory utilized by each process in a 30-minute period. The information was gathered during continuous operation throughout a 2-day period.
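The memory column mixes units (MB for Dolos, GB for the Haskell node); normalizing to bytes makes the comparison direct. A quick sketch using the first row of the table (2023-06-19 08:30:00):

```python
# Decimal unit multipliers, matching the units used in the table.
UNITS = {"KB": 10**3, "MB": 10**6, "GB": 10**9}

def to_bytes(value: str) -> int:
    """Convert a string like '20.3 MB' to a number of bytes."""
    amount, unit = value.split()
    return round(float(amount) * UNITS[unit])

dolos = to_bytes("20.3 MB")     # first row, Dolos
haskell = to_bytes("2.18 GB")   # first row, Haskell node
print(haskell // dolos)         # rough memory ratio between the two
```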
Time | Dolos | Haskell |
---|---|---|
2023-06-19 08:30:00 | 20.3 MB | 2.18 GB |
2023-06-19 09:00:00 | 20.4 MB | 2.18 GB |
2023-06-19 09:30:00 | 20.7 MB | 2.18 GB |
2023-06-19 10:00:00 | 20.9 MB | 2.18 GB |
2023-06-19 10:30:00 | 21.1 MB | 2.18 GB |
2023-06-19 11:00:00 | 21.1 MB | 2.18 GB |
2023-06-19 11:30:00 | 21.2 MB | 2.18 GB |
2023-06-19 12:00:00 | 21.4 MB | 2.18 GB |
2023-06-19 12:30:00 | 21.7 MB | 2.18 GB |
2023-06-19 13:00:00 | 21.9 MB | 2.18 GB |
2023-06-19 13:30:00 | 21.9 MB | 2.18 GB |
2023-06-19 14:00:00 | 22.2 MB | 2.18 GB |
2023-06-19 14:30:00 | 22.3 MB | 2.18 GB |
2023-06-19 15:00:00 | 22.5 MB | 2.18 GB |
2023-06-19 15:30:00 | 22.8 MB | 2.18 GB |
2023-06-19 16:00:00 | 23.0 MB | 2.18 GB |
2023-06-19 16:30:00 | 23.5 MB | 2.18 GB |
2023-06-19 17:00:00 | 24.1 MB | 2.18 GB |
2023-06-19 17:30:00 | 24.7 MB | 2.18 GB |
2023-06-19 18:00:00 | 25.3 MB | 2.18 GB |
2023-06-19 18:30:00 | 26.0 MB | 2.18 GB |
2023-06-19 19:00:00 | 26.6 MB | 2.18 GB |
2023-06-19 19:30:00 | 27.3 MB | 2.18 GB |
2023-06-19 20:00:00 | 27.8 MB | 2.18 GB |
2023-06-19 20:30:00 | 28.5 MB | 2.18 GB |
2023-06-19 21:00:00 | 28.9 MB | 2.18 GB |
2023-06-19 21:30:00 | 29.5 MB | 2.18 GB |
2023-06-19 22:00:00 | 30.1 MB | 2.18 GB |
2023-06-19 22:30:00 | 30.6 MB | 2.18 GB |
2023-06-19 23:00:00 | 31.2 MB | 2.18 GB |
2023-06-19 23:30:00 | 32.0 MB | 2.18 GB |
2023-06-20 00:00:00 | 32.5 MB | 2.18 GB |
2023-06-20 00:30:00 | 33.2 MB | 2.18 GB |
2023-06-20 01:00:00 | 33.8 MB | 2.18 GB |
2023-06-20 01:30:00 | 34.5 MB | 2.18 GB |
2023-06-20 02:00:00 | 34.7 MB | 2.18 GB |
2023-06-20 02:30:00 | 34.8 MB | 2.18 GB |
2023-06-20 03:00:00 | 35.1 MB | 2.18 GB |
2023-06-20 03:30:00 | 35.1 MB | 2.18 GB |
2023-06-20 04:00:00 | 35.3 MB | 2.18 GB |
2023-06-20 04:30:00 | 35.1 MB | 2.18 GB |
2023-06-20 05:00:00 | 35.5 MB | 2.18 GB |
2023-06-20 05:30:00 | 35.8 MB | 2.18 GB |
2023-06-20 06:00:00 | 36.0 MB | 2.18 GB |
2023-06-20 06:30:00 | 36.1 MB | 2.18 GB |
2023-06-20 07:00:00 | 36.5 MB | 2.18 GB |
2023-06-20 07:30:00 | 36.6 MB | 2.18 GB |
2023-06-20 08:00:00 | 36.9 MB | 2.18 GB |
2023-06-20 08:30:00 | 37.1 MB | 2.18 GB |
2023-06-20 09:00:00 | 37.0 MB | 2.18 GB |
2023-06-20 09:30:00 | 37.2 MB | 2.18 GB |
2023-06-20 10:00:00 | 37.5 MB | 2.18 GB |
2023-06-20 10:30:00 | 37.5 MB | 2.18 GB |
2023-06-20 11:00:00 | 37.6 MB | 2.18 GB |
2023-06-20 11:30:00 | 38.1 MB | 2.18 GB |
2023-06-20 12:00:00 | 38.1 MB | 2.18 GB |
2023-06-20 12:30:00 | 38.2 MB | 2.18 GB |
2023-06-20 13:00:00 | 38.5 MB | 2.18 GB |
2023-06-20 13:30:00 | 38.5 MB | 2.18 GB |
2023-06-20 14:00:00 | 38.8 MB | 2.18 GB |
2023-06-20 14:30:00 | 39.0 MB | 2.18 GB |
2023-06-20 15:00:00 | 39.2 MB | 2.18 GB |
2023-06-20 15:30:00 | 39.2 MB | 2.18 GB |
2023-06-20 16:00:00 | 39.3 MB | 2.18 GB |
2023-06-20 16:30:00 | 39.6 MB | 2.18 GB |
2023-06-20 17:00:00 | 39.7 MB | 2.18 GB |
2023-06-20 17:30:00 | 39.7 MB | 2.18 GB |
2023-06-20 18:00:00 | 39.9 MB | 2.18 GB |
2023-06-20 18:30:00 | 40.2 MB | 2.18 GB |
2023-06-20 19:00:00 | 40.1 MB | 2.18 GB |
2023-06-20 19:30:00 | 40.3 MB | 2.18 GB |
2023-06-20 20:00:00 | 40.5 MB | 2.18 GB |
2023-06-20 20:30:00 | 40.7 MB | 2.18 GB |
2023-06-20 21:00:00 | 40.7 MB | 2.18 GB |
2023-06-20 21:30:00 | 40.8 MB | 2.18 GB |
2023-06-20 22:00:00 | 40.9 MB | 2.18 GB |
2023-06-20 22:30:00 | 41.1 MB | 2.18 GB |
2023-06-20 23:00:00 | 41.1 MB | 2.18 GB |
2023-06-20 23:30:00 | 41.2 MB | 2.18 GB |
2023-06-21 00:00:00 | 41.4 MB | 2.18 GB |
2023-06-21 00:30:00 | 41.5 MB | 2.18 GB |
2023-06-21 01:00:00 | 41.8 MB | 2.18 GB |
2023-06-21 01:30:00 | 41.8 MB | 2.18 GB |
2023-06-21 02:00:00 | 41.9 MB | 2.18 GB |
2023-06-21 02:30:00 | 42.3 MB | 2.18 GB |
2023-06-21 03:00:00 | 42.2 MB | 2.18 GB |
2023-06-21 03:30:00 | 42.5 MB | 2.18 GB |
2023-06-21 04:00:00 | 42.7 MB | 2.18 GB |
2023-06-21 04:30:00 | 42.2 MB | 2.18 GB |
2023-06-21 05:00:00 | 42.7 MB | 2.18 GB |
2023-06-21 05:30:00 | 43.1 MB | 2.18 GB |
2023-06-21 06:00:00 | 43.4 MB | 2.18 GB |
2023-06-21 06:30:00 | 43.5 MB | 2.18 GB |
2023-06-21 07:00:00 | 43.7 MB | 2.18 GB |
2023-06-21 07:30:00 | 16.3 MB | 2.18 GB |
2023-06-21 08:00:00 | 16.7 MB | 2.18 GB |
2023-06-21 08:30:00 | 17.1 MB | 2.18 GB |
Long-running operation - October test
In particular, we want to measure the CPU and memory usage of Dolos compared to the Haskell node. The main goal of Dolos is to provide a lightweight alternative (with substantially reduced features) for data access use cases. To perform the comparison, both implementations were hosted under equivalent conditions:
- same hardware
- both fully-synced to the Preview network
- similar client request profile (1 chain-sync consumer)
CPU Usage
In this analysis we compare CPU usage, expressed as shares of a vCPU (~1 core); 1 share represents 1/1000 of a vCPU. Each bucket represents the average number of shares utilized by each process in a 15-minute period. The information was gathered during continuous operation throughout a 24-hour period.
Time | Dolos (shares) | Haskell (shares) |
---|---|---|
2023-10-19 14:45:00 | 26.9 | 302 |
2023-10-19 15:00:00 | 21.8 | 287 |
2023-10-19 15:15:00 | 27.8 | 309 |
2023-10-19 15:30:00 | 28.2 | 303 |
2023-10-19 15:45:00 | 29.0 | 301 |
2023-10-19 16:00:00 | 29.3 | 221 |
2023-10-19 16:15:00 | 30.5 | 327 |
2023-10-19 16:30:00 | 26.5 | 302 |
2023-10-19 16:45:00 | 26.9 | 300 |
2023-10-19 17:00:00 | 28.5 | 307 |
2023-10-19 17:15:00 | 26.1 | 300 |
2023-10-19 17:30:00 | 30.8 | 311 |
2023-10-19 17:45:00 | 22.9 | 285 |
2023-10-19 18:00:00 | 24.0 | 290 |
2023-10-19 18:15:00 | 26.5 | 303 |
2023-10-19 18:30:00 | 22.7 | 287 |
2023-10-19 18:45:00 | 22.7 | 293 |
2023-10-19 19:00:00 | 25.7 | 297 |
2023-10-19 19:15:00 | 21.0 | 291 |
2023-10-19 19:30:00 | 23.0 | 301 |
2023-10-19 19:45:00 | 22.5 | 297 |
2023-10-19 20:00:00 | 23.4 | 307 |
2023-10-19 20:15:00 | 25.7 | 302 |
2023-10-19 20:30:00 | 25.0 | 306 |
2023-10-19 20:45:00 | 25.6 | 302 |
2023-10-19 21:00:00 | 25.4 | 297 |
2023-10-19 21:15:00 | 23.5 | 300 |
2023-10-19 21:30:00 | 25.4 | 301 |
2023-10-19 21:45:00 | 22.2 | 303 |
2023-10-19 22:00:00 | 23.9 | 298 |
2023-10-19 22:15:00 | 26.7 | 296 |
2023-10-19 22:30:00 | 22.7 | 290 |
2023-10-19 22:45:00 | 25.5 | 300 |
2023-10-19 23:00:00 | 22.6 | 300 |
2023-10-19 23:15:00 | 22.8 | 290 |
2023-10-19 23:30:00 | 25.4 | 294 |
2023-10-19 23:45:00 | 23.6 | 301 |
2023-10-20 00:00:00 | 24.0 | 295 |
2023-10-20 00:15:00 | 21.9 | 301 |
2023-10-20 00:30:00 | 22.7 | 290 |
2023-10-20 00:45:00 | 23.9 | 302 |
2023-10-20 01:00:00 | 23.7 | 303 |
2023-10-20 01:15:00 | 23.4 | 288 |
2023-10-20 01:30:00 | 23.7 | 297 |
2023-10-20 01:45:00 | 23.5 | 296 |
2023-10-20 02:00:00 | 27.2 | 303 |
2023-10-20 02:15:00 | 22.2 | 290 |
2023-10-20 02:30:00 | 23.9 | 291 |
2023-10-20 02:45:00 | 23.2 | 289 |
2023-10-20 03:00:00 | 25.8 | 307 |
2023-10-20 03:15:00 | 26.3 | 290 |
2023-10-20 03:30:00 | 25.3 | 300 |
2023-10-20 03:45:00 | 22.0 | 290 |
2023-10-20 04:00:00 | 26.6 | 299 |
2023-10-20 04:15:00 | 26.4 | 301 |
2023-10-20 04:30:00 | 27.0 | 293 |
2023-10-20 04:45:00 | 26.8 | 302 |
2023-10-20 05:00:00 | 31.1 | 309 |
2023-10-20 05:15:00 | 24.7 | 290 |
2023-10-20 05:30:00 | 28.1 | 304 |
2023-10-20 05:45:00 | 24.1 | 282 |
2023-10-20 06:00:00 | 29.3 | 308 |
2023-10-20 06:15:00 | 25.3 | 298 |
2023-10-20 06:30:00 | 24.3 | 287 |
2023-10-20 06:45:00 | 32.2 | 317 |
2023-10-20 07:00:00 | 26.8 | 288 |
2023-10-20 07:15:00 | 27.0 | 295 |
2023-10-20 07:30:00 | 30.3 | 306 |
2023-10-20 07:45:00 | 28.6 | 300 |
2023-10-20 08:00:00 | 32.2 | 316 |
2023-10-20 08:15:00 | 32.4 | 301 |
2023-10-20 08:30:00 | 31.0 | 302 |
2023-10-20 08:45:00 | 41.5 | 310 |
2023-10-20 09:00:00 | 22.9 | 285 |
2023-10-20 09:15:00 | 31.8 | 302 |
2023-10-20 09:30:00 | 28.5 | 285 |
2023-10-20 09:45:00 | 31.0 | 299 |
2023-10-20 10:00:00 | 26.1 | 285 |
2023-10-20 10:15:00 | 32.2 | 309 |
2023-10-20 10:30:00 | 26.8 | 296 |
2023-10-20 10:45:00 | 25.3 | 285 |
2023-10-20 11:00:00 | 33.7 | 306 |
2023-10-20 11:15:00 | 31.0 | 303 |
2023-10-20 11:30:00 | 30.9 | 293 |
2023-10-20 11:45:00 | 30.1 | 308 |
2023-10-20 12:00:00 | 32.8 | 299 |
2023-10-20 12:15:00 | 23.9 | 293 |
2023-10-20 12:30:00 | 29.3 | 302 |
2023-10-20 12:45:00 | 32.7 | 299 |
2023-10-20 13:00:00 | 28.7 | 304 |
2023-10-20 13:15:00 | 32.9 | 307 |
2023-10-20 13:30:00 | 37.5 | 310 |
2023-10-20 13:45:00 | 32.0 | 309 |
2023-10-20 14:00:00 | 32.2 | 298 |
2023-10-20 14:15:00 | 33.1 | 309 |
2023-10-20 14:30:00 | 34.2 | 313 |
2023-10-20 14:45:00 | 35.7 | 308 |
Memory Usage
In this analysis we compare memory usage, expressed as a total amount of data (KB, MB, GB). Each bucket represents the average memory utilized by each process in a 15-minute period. The information was gathered during continuous operation throughout a 24-hour period.
Time | Dolos | Haskell |
---|---|---|
2023-10-19 14:45:00 | 62.3 MB | 2.44 GB |
2023-10-19 15:00:00 | 63.2 MB | 2.44 GB |
2023-10-19 15:15:00 | 61.6 MB | 2.44 GB |
2023-10-19 15:30:00 | 62.5 MB | 2.44 GB |
2023-10-19 15:45:00 | 63.3 MB | 2.44 GB |
2023-10-19 16:00:00 | 63.0 MB | 2.44 GB |
2023-10-19 16:15:00 | 63.9 MB | 2.44 GB |
2023-10-19 16:30:00 | 64.4 MB | 2.44 GB |
2023-10-19 16:45:00 | 64.8 MB | 2.44 GB |
2023-10-19 17:00:00 | 64.9 MB | 2.44 GB |
2023-10-19 17:15:00 | 65.2 MB | 2.44 GB |
2023-10-19 17:30:00 | 65.8 MB | 2.44 GB |
2023-10-19 17:45:00 | 66.0 MB | 2.44 GB |
2023-10-19 18:00:00 | 66.1 MB | 2.44 GB |
2023-10-19 18:15:00 | 54.7 MB | 2.44 GB |
2023-10-19 18:30:00 | 55.4 MB | 2.44 GB |
2023-10-19 18:45:00 | 55.9 MB | 2.44 GB |
2023-10-19 19:00:00 | 57.0 MB | 2.44 GB |
2023-10-19 19:15:00 | 57.3 MB | 2.44 GB |
2023-10-19 19:30:00 | 57.6 MB | 2.44 GB |
2023-10-19 19:45:00 | 57.9 MB | 2.44 GB |
2023-10-19 20:00:00 | 58.4 MB | 2.44 GB |
2023-10-19 20:15:00 | 58.8 MB | 2.44 GB |
2023-10-19 20:30:00 | 59.0 MB | 2.44 GB |
2023-10-19 20:45:00 | 59.4 MB | 2.44 GB |
2023-10-19 21:00:00 | 59.6 MB | 2.44 GB |
2023-10-19 21:15:00 | 60.0 MB | 2.44 GB |
2023-10-19 21:30:00 | 60.2 MB | 2.44 GB |
2023-10-19 21:45:00 | 60.5 MB | 2.44 GB |
2023-10-19 22:00:00 | 61.0 MB | 2.44 GB |
2023-10-19 22:15:00 | 61.1 MB | 2.44 GB |
2023-10-19 22:30:00 | 61.5 MB | 2.44 GB |
2023-10-19 22:45:00 | 62.0 MB | 2.44 GB |
2023-10-19 23:00:00 | 62.3 MB | 2.44 GB |
2023-10-19 23:15:00 | 62.6 MB | 2.44 GB |
2023-10-19 23:30:00 | 62.8 MB | 2.44 GB |
2023-10-19 23:45:00 | 63.2 MB | 2.44 GB |
2023-10-20 00:00:00 | 63.4 MB | 2.44 GB |
2023-10-20 00:15:00 | 63.8 MB | 2.44 GB |
2023-10-20 00:30:00 | 64.3 MB | 2.44 GB |
2023-10-20 00:45:00 | 64.6 MB | 2.44 GB |
2023-10-20 01:00:00 | 64.9 MB | 2.44 GB |
2023-10-20 01:15:00 | 65.2 MB | 2.44 GB |
2023-10-20 01:30:00 | 65.4 MB | 2.44 GB |
2023-10-20 01:45:00 | 66.0 MB | 2.44 GB |
2023-10-20 02:00:00 | 66.4 MB | 2.44 GB |
2023-10-20 02:15:00 | 66.7 MB | 2.44 GB |
2023-10-20 02:30:00 | 66.9 MB | 2.44 GB |
2023-10-20 02:45:00 | 67.3 MB | 2.44 GB |
2023-10-20 03:00:00 | 68.0 MB | 2.44 GB |
2023-10-20 03:15:00 | 68.5 MB | 2.44 GB |
2023-10-20 03:30:00 | 69.0 MB | 2.44 GB |
2023-10-20 03:45:00 | 69.2 MB | 2.44 GB |
2023-10-20 04:00:00 | 69.4 MB | 2.44 GB |
2023-10-20 04:15:00 | 69.6 MB | 2.44 GB |
2023-10-20 04:30:00 | 70.1 MB | 2.44 GB |
2023-10-20 04:45:00 | 70.5 MB | 2.44 GB |
2023-10-20 05:00:00 | 70.8 MB | 2.44 GB |
2023-10-20 05:15:00 | 71.2 MB | 2.44 GB |
2023-10-20 05:30:00 | 71.5 MB | 2.44 GB |
2023-10-20 05:45:00 | 71.9 MB | 2.44 GB |
2023-10-20 06:00:00 | 72.2 MB | 2.44 GB |
2023-10-20 06:15:00 | 72.6 MB | 2.44 GB |
2023-10-20 06:30:00 | 72.7 MB | 2.44 GB |
2023-10-20 06:45:00 | 73.0 MB | 2.44 GB |
2023-10-20 07:00:00 | 73.3 MB | 2.44 GB |
2023-10-20 07:15:00 | 73.6 MB | 2.44 GB |
2023-10-20 07:30:00 | 74.0 MB | 2.44 GB |
2023-10-20 07:45:00 | 74.4 MB | 2.44 GB |
2023-10-20 08:00:00 | 74.9 MB | 2.44 GB |
2023-10-20 08:15:00 | 75.4 MB | 2.44 GB |
2023-10-20 08:30:00 | 75.9 MB | 2.44 GB |
2023-10-20 08:45:00 | 79.5 MB | 2.44 GB |
2023-10-20 09:00:00 | 79.6 MB | 2.44 GB |
2023-10-20 09:15:00 | 79.9 MB | 2.44 GB |
2023-10-20 09:30:00 | 80.3 MB | 2.44 GB |
2023-10-20 09:45:00 | 80.9 MB | 2.44 GB |
2023-10-20 10:00:00 | 81.3 MB | 2.44 GB |
2023-10-20 10:15:00 | 82.0 MB | 2.44 GB |
2023-10-20 10:30:00 | 82.4 MB | 2.44 GB |
2023-10-20 10:45:00 | 82.8 MB | 2.44 GB |
2023-10-20 11:00:00 | 83.3 MB | 2.44 GB |
2023-10-20 11:15:00 | 83.4 MB | 2.44 GB |
2023-10-20 11:30:00 | 83.9 MB | 2.44 GB |
2023-10-20 11:45:00 | 84.2 MB | 2.44 GB |
2023-10-20 12:00:00 | 84.7 MB | 2.44 GB |
2023-10-20 12:15:00 | 84.8 MB | 2.44 GB |
2023-10-20 12:30:00 | 85.5 MB | 2.44 GB |
2023-10-20 12:45:00 | 85.8 MB | 2.44 GB |
2023-10-20 13:00:00 | 86.1 MB | 2.44 GB |
2023-10-20 13:15:00 | 86.9 MB | 2.44 GB |
2023-10-20 13:30:00 | 88.1 MB | 2.44 GB |
2023-10-20 13:45:00 | 88.4 MB | 2.44 GB |
2023-10-20 14:00:00 | 88.6 MB | 2.44 GB |
2023-10-20 14:15:00 | 89.0 MB | 2.44 GB |
2023-10-20 14:30:00 | 89.2 MB | 2.44 GB |
2023-10-20 14:45:00 | 89.4 MB | 2.44 GB |
Byron Phase-1 Sync - Resource Footprint Test
A "resource footprint" test refers to a type of test designed to evaluate the amount of resources, usually memory and CPU, that a software component requires to perform a particular action.
Among other things, these tests are often used to ensure that theoretical estimations of resource consumption match reality. These tests are also useful as reference profiles that allow system operators to anticipate resource requirements.
The goal of this particular test instance is to understand the impact of Byron Phase-1 validations on Dolos resource footprint.
The data presented was gathered during a chain-sync process against a single upstream Haskell node while executing Byron Phase-1 validations. The Dolos version used for this test matches the code in the branch `feat/byron-phase-1-validations`.
CPU Usage
In this analysis we track CPU usage, expressed as shares of a vCPU (~1 core); 1 share represents 1/1000 of a vCPU. Each bucket represents the average number of shares utilized by the process in a 15-minute period. The information was gathered during continuous operation throughout a 1-day period.
Time | CPU (shares) |
---|---|
2023-11-11 12:00:00 | 28501 |
2023-11-11 12:15:00 | 20686 |
2023-11-11 12:30:00 | 26710 |
2023-11-11 12:45:00 | 31660 |
2023-11-11 13:00:00 | 36823 |
2023-11-11 13:15:00 | 39627 |
2023-11-11 13:30:00 | 47347 |
2023-11-11 13:45:00 | 45836 |
2023-11-11 14:00:00 | 33164 |
2023-11-11 14:15:00 | 38352 |
2023-11-11 14:30:00 | 41195 |
2023-11-11 14:45:00 | 48371 |
2023-11-11 15:00:00 | 49461 |
2023-11-11 15:15:00 | 49989 |
2023-11-11 15:30:00 | 38401 |
2023-11-11 15:45:00 | 8230 |
2023-11-11 16:00:00 | 13949 |
2023-11-11 16:15:00 | 19537 |
2023-11-11 16:30:00 | 24954 |
2023-11-11 16:45:00 | 28684 |
2023-11-11 17:00:00 | 35054 |
2023-11-11 17:15:00 | 24448 |
2023-11-11 17:30:00 | 23852 |
2023-11-11 17:45:00 | 29125 |
2023-11-11 18:00:00 | 32382 |
2023-11-11 18:15:00 | 39469 |
2023-11-11 18:30:00 | 44484 |
2023-11-11 18:45:00 | 48805 |
2023-11-11 19:00:00 | 40635 |
2023-11-11 19:15:00 | 33528 |
2023-11-11 19:30:00 | 40810 |
2023-11-11 19:45:00 | 46095 |
2023-11-11 20:00:00 | 49791 |
2023-11-11 20:15:00 | 49991 |
2023-11-11 20:30:00 | 48287 |
2023-11-11 20:45:00 | 49300 |
2023-11-11 21:00:00 | 15964 |
2023-11-11 21:15:00 | 10406 |
2023-11-11 21:30:00 | 15918 |
2023-11-11 21:45:00 | 19983 |
2023-11-11 22:00:00 | 26377 |
2023-11-11 22:15:00 | 31566 |
2023-11-11 22:30:00 | 37046 |
2023-11-11 22:45:00 | 29951 |
2023-11-11 23:00:00 | 22406 |
2023-11-11 23:15:00 | 27450 |
2023-11-11 23:30:00 | 33162 |
2023-11-11 23:45:00 | 38081 |
2023-11-12 00:00:00 | 40905 |
2023-11-12 00:15:00 | 48397 |
2023-11-12 00:30:00 | 49994 |
2023-11-12 00:45:00 | 37848 |
2023-11-12 01:00:00 | 38645 |
2023-11-12 01:15:00 | 40736 |
2023-11-12 01:30:00 | 48010 |
2023-11-12 01:45:00 | 49992 |
2023-11-12 02:00:00 | 49993 |
2023-11-12 02:15:00 | 49997 |
2023-11-12 02:30:00 | 48944 |
2023-11-12 02:45:00 | 43698 |
2023-11-12 03:00:00 | 6808 |
2023-11-12 03:15:00 | 12400 |
2023-11-12 03:30:00 | 18093 |
2023-11-12 03:45:00 | 21877 |
2023-11-12 04:00:00 | 28027 |
2023-11-12 04:15:00 | 33505 |
2023-11-12 04:30:00 | 38642 |
2023-11-12 04:45:00 | 26414 |
2023-11-12 05:00:00 | 23685 |
2023-11-12 05:15:00 | 28791 |
2023-11-12 05:30:00 | 34754 |
2023-11-12 05:45:00 | 40414 |
2023-11-12 06:00:00 | 41367 |
2023-11-12 06:15:00 | 48390 |
2023-11-12 06:30:00 | 49436 |
2023-11-12 06:45:00 | 38004 |
2023-11-12 07:00:00 | 38935 |
2023-11-12 07:15:00 | 41382 |
2023-11-12 07:30:00 | 48339 |
2023-11-12 07:45:00 | 49987 |
2023-11-12 08:00:00 | 48812 |
2023-11-12 08:15:00 | 49991 |
2023-11-12 08:30:00 | 46078 |
2023-11-12 08:45:00 | 6153 |
2023-11-12 09:00:00 | 11884 |
2023-11-12 09:15:00 | 16960 |
2023-11-12 09:30:00 | 22209 |
2023-11-12 09:45:00 | 25535 |
2023-11-12 10:00:00 | 32311 |
2023-11-12 10:15:00 | 37827 |
2023-11-12 10:30:00 | 26411 |
2023-11-12 10:45:00 | 23260 |
2023-11-12 11:00:00 | 26849 |
2023-11-12 11:15:00 | 33426 |
2023-11-12 11:30:00 | 38271 |
2023-11-12 11:45:00 | 43852 |
2023-11-12 12:00:00 | 47924 |
Memory Usage
In this analysis we track memory usage, expressed as a total amount of data (KB, MB, GB). Each bucket represents the average memory utilized by the process over a 15-minute period. The information was gathered during continuous operation throughout a 1-day period.
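To compare or aggregate rows, the human-readable sizes in the table can be normalized back to bytes. A minimal sketch, assuming decimal units (KB = 10^3, MB = 10^6, GB = 10^9 bytes); the `parse_size` helper is illustrative, not part of any Dolos tooling:

```python
# Decimal size units (assumption: KB/MB/GB here mean powers of 1000)
UNITS = {"KB": 1_000, "MB": 1_000_000, "GB": 1_000_000_000}

def parse_size(text):
    """Parse a human-readable size like '446 MB' into bytes."""
    value, unit = text.split()
    return int(value) * UNITS[unit]

# A few values from the table below, summarized as min / max / mean bytes
rows = ["446 MB", "448 MB", "452 MB"]
sizes = [parse_size(r) for r in rows]
print(min(sizes), max(sizes), sum(sizes) // len(sizes))
```

With all values normalized to a single unit, spotting the narrow ~430-465 MB band the process stays within becomes a simple min/max computation.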
Time | Memory |
---|---|
2023-11-11 12:00:00 | 446 MB |
2023-11-11 12:15:00 | 448 MB |
2023-11-11 12:30:00 | 452 MB |
2023-11-11 12:45:00 | 453 MB |
2023-11-11 13:00:00 | 453 MB |
2023-11-11 13:15:00 | 454 MB |
2023-11-11 13:30:00 | 433 MB |
2023-11-11 13:45:00 | 428 MB |
2023-11-11 14:00:00 | 431 MB |
2023-11-11 14:15:00 | 436 MB |
2023-11-11 14:30:00 | 442 MB |
2023-11-11 14:45:00 | 444 MB |
2023-11-11 15:00:00 | 446 MB |
2023-11-11 15:15:00 | 454 MB |
2023-11-11 15:30:00 | 460 MB |
2023-11-11 15:45:00 | 462 MB |
2023-11-11 16:00:00 | 464 MB |
2023-11-11 16:15:00 | 447 MB |
2023-11-11 16:30:00 | 452 MB |
2023-11-11 16:45:00 | 455 MB |
2023-11-11 17:00:00 | 457 MB |
2023-11-11 17:15:00 | 446 MB |
2023-11-11 17:30:00 | 447 MB |
2023-11-11 17:45:00 | 449 MB |
2023-11-11 18:00:00 | 449 MB |
2023-11-11 18:15:00 | 434 MB |
2023-11-11 18:30:00 | 434 MB |
2023-11-11 18:45:00 | 443 MB |
2023-11-11 19:00:00 | 442 MB |
2023-11-11 19:15:00 | 447 MB |
2023-11-11 19:30:00 | 453 MB |
2023-11-11 19:45:00 | 456 MB |
2023-11-11 20:00:00 | 457 MB |
2023-11-11 20:15:00 | 459 MB |
2023-11-11 20:30:00 | 460 MB |
2023-11-11 20:45:00 | 460 MB |
2023-11-11 21:00:00 | 458 MB |
2023-11-11 21:15:00 | 458 MB |
2023-11-11 21:30:00 | 460 MB |
2023-11-11 21:45:00 | 461 MB |
2023-11-11 22:00:00 | 461 MB |
2023-11-11 22:15:00 | 462 MB |
2023-11-11 22:30:00 | 431 MB |
2023-11-11 22:45:00 | 421 MB |
2023-11-11 23:00:00 | 422 MB |
2023-11-11 23:15:00 | 422 MB |
2023-11-11 23:30:00 | 423 MB |
2023-11-11 23:45:00 | 427 MB |
2023-11-12 00:00:00 | 430 MB |
2023-11-12 00:15:00 | 433 MB |
2023-11-12 00:30:00 | 442 MB |
2023-11-12 00:45:00 | 445 MB |
2023-11-12 01:00:00 | 446 MB |
2023-11-12 01:15:00 | 446 MB |
2023-11-12 01:30:00 | 448 MB |
2023-11-12 01:45:00 | 449 MB |
2023-11-12 02:00:00 | 449 MB |
2023-11-12 02:15:00 | 450 MB |
2023-11-12 02:30:00 | 450 MB |
2023-11-12 02:45:00 | 433 MB |
2023-11-12 03:00:00 | 435 MB |
2023-11-12 03:15:00 | 436 MB |
2023-11-12 03:30:00 | 436 MB |
2023-11-12 03:45:00 | 437 MB |
2023-11-12 04:00:00 | 439 MB |
2023-11-12 04:15:00 | 443 MB |
2023-11-12 04:30:00 | 454 MB |
2023-11-12 04:45:00 | 449 MB |
2023-11-12 05:00:00 | 451 MB |
2023-11-12 05:15:00 | 452 MB |
2023-11-12 05:30:00 | 453 MB |
2023-11-12 05:45:00 | 455 MB |
2023-11-12 06:00:00 | 456 MB |
2023-11-12 06:15:00 | 429 MB |
2023-11-12 06:30:00 | 437 MB |
2023-11-12 06:45:00 | 441 MB |
2023-11-12 07:00:00 | 442 MB |
2023-11-12 07:15:00 | 443 MB |
2023-11-12 07:30:00 | 444 MB |
2023-11-12 07:45:00 | 444 MB |
2023-11-12 08:00:00 | 444 MB |
2023-11-12 08:15:00 | 461 MB |
2023-11-12 08:30:00 | 458 MB |
2023-11-12 08:45:00 | 460 MB |
2023-11-12 09:00:00 | 461 MB |
2023-11-12 09:15:00 | 460 MB |
2023-11-12 09:30:00 | 461 MB |
2023-11-12 09:45:00 | 461 MB |
2023-11-12 10:00:00 | 462 MB |
2023-11-12 10:15:00 | 462 MB |
2023-11-12 10:30:00 | 448 MB |
2023-11-12 10:45:00 | 450 MB |
2023-11-12 11:00:00 | 450 MB |
2023-11-12 11:15:00 | 450 MB |
2023-11-12 11:30:00 | 450 MB |
2023-11-12 11:45:00 | 450 MB |
2023-11-12 12:00:00 | 457 MB |