happy :)

I was very happy to see that my blog post was featured on the druid.io website 🙂


Here is the link to the post.


java.lang.RuntimeException: native snappy library not available – druid ingestion

I was ingesting some data into Druid using local batch mode and the ingestion failed with:

2018-07-10T04:57:51,501 INFO [Thread-31] org.apache.hadoop.mapred.LocalJobRunner – map task executor complete.
2018-07-10T04:57:51,506 WARN [Thread-31] org.apache.hadoop.mapred.LocalJobRunner – job_local265070473_0001
java.lang.Exception: java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.
at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:489) ~[hadoop-mapreduce-client-common-2.6.0-cdh5.9.0.jar:?]
at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:549) [hadoop-mapreduce-client-common-2.6.0-cdh5.9.0.jar:?]
Caused by: java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support.

To avoid Snappy compression, I disabled it with the following tuningConfig in the ingestion spec:

"tuningConfig": {

                        "type": "hadoop",

                        "partitionsSpec": {

                                "type": "hashed",

                                "targetPartitionSize": 5000000

                        },

                        "jobProperties": {

                                 "mapreduce.framework.name" : "local",

                                 "mapreduce.map.output.compress":"false"

                                "mapreduce.job.classloader": "true",

                                "mapreduce.job.classloader.system.classes": "-javax.validation.,java.,javax.,org.apache.commons.logging.,org.apache.log4j.,org.apache.hadoop."
                                  .........
                                  .........
                        }

                }
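If you would rather keep Snappy enabled, you can first check whether your Hadoop build actually ships native Snappy support. A quick check, assuming the hadoop CLI is on your PATH:

    # prints each native library (zlib, snappy, lz4, ...) and whether
    # this build of libhadoop can load it
    hadoop checknative -a

If snappy shows up as false there, either install a libhadoop built with Snappy or disable the compression as above.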


Pushing Druid metrics to Graphite/Grafana with a full metrics whitelist

This blog post is about configuring Druid to send metrics to Graphite/Grafana.

Kindly refer to the steps at:

https://github.com/jaihind213/druid_stuff/tree/master/druid_metrics_graphite

The Grafana dashboard templates are also in the above GitHub link.
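In short, the steps configure the graphite-emitter extension in common.runtime.properties. A minimal sketch of the key properties, with a placeholder hostname and whitelist path (property names can differ between releases, so treat this as an outline and follow the linked steps):

    # load the graphite emitter extension
    druid.extensions.loadList=["graphite-emitter"]
    # emit metrics to graphite
    druid.emitter=graphite
    druid.emitter.graphite.hostname=my.graphite.host
    druid.emitter.graphite.port=2003
    # the whiteList converter emits only the metrics named in the map file
    druid.emitter.graphite.eventConverter={"type":"whiteList", "namespacePrefix": "druid", "mapPath": "/opt/druid/conf/whitelist.json"}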

———————————————————————————————————————

After you have followed the steps above, restart Druid and it should start sending metrics to Graphite and Grafana.

You should see log statements like

INFO [main] io.druid.emitter.graphite.GraphiteEmitter - Starting Graphite Emitter.

INFO [GraphiteEmitter-0] io.druid.emitter.graphite.GraphiteEmitter - trying to connect to graphite server

INFO [main] io.druid.initialization.Initialization - Loading extension [graphite-emitter] for class


Tip: you can set the frequency at which metrics are reported, and also choose which monitors you want, based on the Druid release version you use.

Important tip: start off with the JVM monitor first, then slowly add more. The sys monitor needs the Sigar jar, and some monitors might not work immediately.
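For example, the reporting frequency and the monitor list are controlled by two properties. A sketch for a 0.12-era release (older releases use com.metamx.metrics.JvmMonitor instead, so match the class name to your version):

    # emit metrics every minute (ISO-8601 period)
    druid.monitoring.emissionPeriod=PT1M
    # start with just the JVM monitor; add more once this works
    druid.monitoring.monitors=["io.druid.java.util.metrics.JvmMonitor"]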




Druid Indexing & Xerces Hell – Loader Constraint Violation

Recently my teammate Vihag came across a java.lang.LinkageError while doing Druid indexing with CDH 5.10.2.

It was good fun, we must say, finding out the reason: taking the md5 of class files, comparing them, and finding out which jar the class was loaded from.
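If you ever need the same trick, the loop looks roughly like this (jar names taken from this incident; adjust paths to your install):

    # extract the suspect class from each candidate jar and compare checksums
    unzip -p xercesImpl-2.9.1.jar org/apache/xerces/jaxp/DocumentBuilderImpl.class > impl-2.9.1.class
    unzip -p xercesImpl-2.10.0.jar org/apache/xerces/jaxp/DocumentBuilderImpl.class > impl-2.10.0.class
    md5sum impl-2.9.1.class impl-2.10.0.class

Running the JVM with -verbose:class also prints which jar each class was loaded from, which saves a lot of guessing.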

—————————————-

2018-05-11 11:03:04,848 FATAL [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
java.lang.LinkageError: loader constraint violation: when resolving overridden method "org.apache.xerces.jaxp.DocumentBuilderImpl.newDocument()Lorg/w3c/dom/Document;" the class loader (instance of org/apache/hadoop/util/ApplicationClassLoader) of the current class, org/apache/xerces/jaxp/DocumentBuilderImpl, and its superclass loader (instance of <bootloader>), have different Class objects for the type org/w3c/dom/Document used in the signature
at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2541)
at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2503)
at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2409)
at org.apache.hadoop.conf.Configuration.get(Configuration.java:1233)
at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
at org.apache.hadoop.mapreduce.TypeConverter.<clinit>(TypeConverter.java:62)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:481)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:469)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1579)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:469)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:391)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1537)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1534)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1467)
2018-05-11 11:03:04,851 INFO [main] org.apache.hadoop.util.ExitUtil: Exiting with status 

—————————————-

Vihag has documented the resolution nicely here on GitHub.

—————————————-

The crux of the matter was that the class 'DocumentBuilderImpl' was present in Druid's xercesImpl-2.9.1.jar while Yarn was loading it from xercesImpl-2.10.0.jar.

So we removed the 2.9.1 jar from Druid and replaced it with the 2.10.0 jar.

We also noticed that the 2.10.0 jar plays well with xml-apis-1.4.01.jar and not with xml-apis-1.3.04.jar.
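The swap itself is just replacing jars under Druid's lib directory. A sketch, assuming a standard layout (back things up first and adjust the paths):

    # move the old pair out and drop in the matching pair
    mv lib/xercesImpl-2.9.1.jar lib/xml-apis-1.3.04.jar /tmp/backup/
    cp /path/to/xercesImpl-2.10.0.jar /path/to/xml-apis-1.4.01.jar lib/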

Hope this helps you.


UnknownArchetype: The desired archetype does not exist

Recently I encountered this error while generating a Maven project in IntelliJ.

[ERROR] BUILD FAILURE
[INFO] ————————————————————————
[INFO] : org.apache.maven.archetype.exception.UnknownArchetype: The desired archetype does not exist (org.apache.maven.archetypes:maven-archetype-webapp:RELEASE)
The desired archetype does not exist (org.apache.maven.archetypes:maven-archetype-webapp:RELEASE)

Solution: the archetypes are downloaded from the Maven Central repo, so make sure the repo is specified in your .m2/settings.xml.

In settings.xml, under profiles -> profile -> repositories, add the Maven Central repo URL:

      <profiles> <profile>  <repositories>

              <repository>

                <releases> <enabled>true</enabled> </releases>

                <snapshots><enabled>true</enabled></snapshots>

                <id>central</id>

                <url>https://repo1.maven.org/maven2</url>

            </repository>

        ……

     </repositories> </profile> </profiles>

Tip: check that the URL 'https://repo1.maven.org/maven2' works first; otherwise point to a repo that does work.
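A quick way to test that from the machine running Maven, assuming curl is installed:

    # an HTTP 200 (or a redirect) means the repo URL is reachable
    curl -I https://repo1.maven.org/maven2/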