
Introducing Timbermill — A better logging service

Eden Gal
Aug 05 - 5 min read

Logging is one of the most fundamental parts of writing code.
Whether you are writing your first Hello World application or the code that will land a rover on Mars, you will need some kind of logs to understand what’s going on in your application.

The problem with traditional logs is that when you investigate an issue in your system, you ultimately spend a lot of time converting the logs into a clear, time-lined representation of the flow that actually occurred.

In essence, you are taking unstructured text and trying to answer the questions:

  • Where and when did the problem occur?
  • What was the flow across the system that preceded the problematic event?
  • What was the event that triggered the flow?
  • What were the parameters’ values in the flow?
  • When did each event begin? When did it end?
  • Which flows run simultaneously?

What you’re trying to achieve

While considering all these factors in a small application is not very difficult, it becomes almost impossible when dealing with big, distributed, asynchronous systems with massive amounts of text logs.

Good luck. Photo by Christa Dodoo on Unsplash

When we at Datorama (a Salesforce company) were first faced with this problem, we started searching for a tool or technology that would make this process easier. After finding none, we decided to create Timbermill, a new kind of task-based, context-aware logging service.

Unlike other logging mechanisms, Timbermill doesn’t produce text logs; it produces Task logs. Every log entity in Timbermill is called a Task, and each task has a unique ID, a name, the time it started, the time it ended, its duration, information about its origins, and various properties that were assigned to it. Every task also holds a status property which states whether it was closed successfully, closed with an error, or not closed at all.

Each task in Timbermill is either connected to another task (its parent task) or is a primary task with no parent. A task’s parent can be defined explicitly at creation, or it is assigned automatically when a task is opened on a thread whose earlier task has not yet been closed.

A task representation in Kibana

Use-case example

Consider this simple application:

@TimberLog(name = "main_application")
public static void main(String[] args) {
  String userId = ...  // the current user's ID
  TimberLogger.logContext("userId", userId);
  isUserAuthorized(userId);
  String sql = getSqlQuery(args);
  List<String> result = runSql(sql);
}

@TimberLog(name = "is_user_authorized")
private static void isUserAuthorized(String userId){
  boolean isAuthorized = ...
  TimberLogger.logString("isAuthorized", String.valueOf(isAuthorized));
}

@TimberLog(name = "get_sql_query")
private static String getSqlQuery(String[] args) {
  String sql = createSqlQueryFromArgs(args);
  TimberLogger.logText("sql", sql);

  validateSql(sql);
  return sql;
}

@TimberLog(name = "create_sql_query_from_args")
private static String createSqlQueryFromArgs(String[] args) {
  ...
}

@TimberLog(name = "validate_sql")
private static void validateSql(String sql) {
  ...
}

@TimberLog(name = "run_sql")
private static List<String> runSql(String sql){
  List<String> result = ...
  TimberLogger.logMetric("resultSize", result.size());
  return result;
}


After running the app we can see the following tasks in Kibana:

Tasks representation in Kibana

Aside from the general properties (name, status, taskBegin, taskEnd, duration, etc…), each task holds these flow-related properties:

  • parentId — Direct parent task ID.
  • parentPath — An ordered list of all the task’s ancestors.
  • primaryId — ID of the first task in the task’s flow.
  • ctx.{PROPERTY_NAME} — All Context key-value properties that were assigned to any of the task ancestors.

Context properties are helpful for variables that are used across the flow (like a user ID), as you only need to assign them once and they will “trickle down” to all of the child tasks.
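For instance, here is a rough sketch of that trickle-down (the handleRequest, loadProfile and buildReport methods are hypothetical; logContext and @TimberLog are used exactly as above):

@TimberLog(name = "handle_request")
public static void handleRequest(String userId) {
  // Assigned once on the parent task...
  TimberLogger.logContext("userId", userId);
  loadProfile();  // ...searchable as ctx.userId on this child task
  buildReport();  // ...and on this one, without logging it again
}

@TimberLog(name = "load_profile")
private static void loadProfile() { /* ... */ }

@TimberLog(name = "build_report")
private static void buildReport() { /* ... */ }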

Using Timbermill — first steps

Prerequisites

  • Java application (Python client is coming)
  • An Elasticsearch cluster
  • Kibana (not mandatory but recommended for visualization)

Install

  • Add the timbermill-client dependency (hosted in Maven Central).

<dependency>
  <groupId>com.datorama.oss</groupId>
  <artifactId>timbermill-client</artifactId>
  <version>2.0.4</version>
</dependency>


  • Create new Timbermill Pipe Config with Elasticsearch cluster’s URL and environment name.
LocalOutputPipeConfig config = new LocalOutputPipeConfig.Builder().url("https://localhost:9200").env("local").build();


  • Bootstrap Timbermill
TimberLogger.bootstrap(new LocalOutputPipe(config));


Usage

  • Add the @TimberLog annotation with a task name before a method or constructor.

@TimberLog(name = "method_to_log")
public static void methodToLog() {
  doSomething();
}


  • Add relevant task information inside the method using
    context/string/text/metric/info functions.
public void doSomething(){
	TimberLogger.logContext("key","context");
	TimberLogger.logText("key","This is a text");
	TimberLogger.logString("key","string");
	TimberLogger.logMetric("key",2);
	TimberLogger.logInfo("This is a regular log");
	TimberLogger.logInfo("This is also a regular log"); 
}


  • Run your code. Your tasks will be given unique IDs and will be written to the Elasticsearch cluster.
  • Configure Timbermill’s index pattern in Kibana.
  • Exit Timbermill when the application is shutting down.
TimberLogger.exit();


  • That’s it! After running your application, you can start investigating your tasks in Kibana.
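Putting the steps above together, a minimal end-to-end setup might look like the sketch below (imports omitted; the Elasticsearch URL, environment name, class name and logged values are placeholders):

public class HelloTimbermill {

  public static void main(String[] args) {
    // Point Timbermill at the Elasticsearch cluster and name the environment
    LocalOutputPipeConfig config = new LocalOutputPipeConfig.Builder()
        .url("https://localhost:9200")
        .env("local")
        .build();
    TimberLogger.bootstrap(new LocalOutputPipe(config));

    try {
      methodToLog(); // opens and closes a task named "method_to_log"
    } finally {
      TimberLogger.exit(); // shut Timbermill down before the application exits
    }
  }

  @TimberLog(name = "method_to_log")
  public static void methodToLog() {
    TimberLogger.logString("greeting", "hello timbermill");
  }
}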

The @TimberLog annotation ensures that a task is opened before the method runs and closed after the method exits. A task is closed with status SUCCESS if the method returns normally, or with status ERROR if an exception was thrown. All the information that was added while the task was open is automatically assigned to the appropriate task, and every other annotated method called while it was open will have this task’s taskId as its parentId.
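For example, a sketch of a method whose task would be closed with status ERROR (riskyOperation and the exception are hypothetical):

@TimberLog(name = "risky_operation")
public static void riskyOperation() {
  // The exception propagates to the caller as usual,
  // and the "risky_operation" task is closed with status ERROR.
  throw new IllegalStateException("something went wrong");
}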

Timbermill + Elasticsearch == ❤️

Timbermill uses Elasticsearch as its back end. Every task is indexed as an Elasticsearch document into a dedicated Timbermill index, and every piece of information added to a task becomes a document property.

Using Elasticsearch gives you the power of Kibana, with its querying, dashboarding and alerting abilities.

Timbermill + Open-source == ❤️❤️

As a company that relies heavily on open-source products, we find it very important to contribute back to the community. That’s why we are releasing Timbermill (and Akita) as an open-source project for everybody to use and contribute to.

Would really appreciate your comments and PRs! https://github.com/datorama/Timbermill
