A Tale of Two Log Forwarders: Logstash and FluentD (part 1.2)

  A couple of ways to install Logstash are as a service on a cloud virtual machine or as a Docker container.  For our example, let's deploy Logstash as a Docker container.  Assuming the Docker engine is installed, we can pull it from the docker.elastic.co repository:
      docker pull docker.elastic.co/logstash/logstash
  You can execute the "docker images" command to verify the image was pulled.  Before we create a container from this image, we'll create a configuration file.
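  For example, filtering by repository should list the image (the tag and image ID columns will depend on the version you pulled):
      docker images docker.elastic.co/logstash/logstash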
  It is possible to have multiple inputs in a single Logstash configuration file, but for this example, let's keep one configuration file per input.
  vim tcp-to-elasticsearch.conf
input {
  tcp {
     port => 8080
  }
}
output {
  elasticsearch {
     hosts => ["10.0.0.41:9200"]
  }
  stdout {
     codec => rubydebug
  }
}
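  As an aside, the same structure works for other input plugins.  A minimal http-input variant, for example, might look like the following sketch (assuming the port and outputs can carry over unchanged):
input {
  http {
     port => 8080
  }
}
output {
  elasticsearch {
     hosts => ["10.0.0.41:9200"]
  }
  stdout {
     codec => rubydebug
  }
}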
  You may have noticed this configuration outputs to an Elasticsearch cluster as well as to standard out.  As the sketch above suggests, you can easily make an http-input configuration out of this file simply by changing the word 'tcp' to 'http.'  Now that our configuration file is created, we're ready to start our container.  When running the following Docker command, make sure you're in the same directory as the configuration file.
docker run -dit --name logstash-tcp --restart=always \
-p 8080:8080 -p 9600:9600 -v "$PWD":/config-dir \
-e http.host=127.0.0.1 docker.elastic.co/logstash/logstash \
-f /config-dir/tcp-to-elasticsearch.conf
  Give it some time to start up, and then check the logs to verify:
     docker logs --tail 10 logstash-tcp
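  Once the pipeline has started, a quick smoke test is to push a line of text at the tcp input and watch the rubydebug codec echo it in the container's logs.  This is only a sketch; it assumes netcat is installed on the host and that your nc variant exits after end-of-input (some need a flag such as -q 1 or -N):
      echo 'hello logstash' | nc localhost 8080
      docker logs --tail 10 logstash-tcp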
