Tuesday, July 7, 2015

eLearnSecurity - Web Application Penetration Testing Certification

Today I received an email about getting 40% off the course if I registered by 7/10/15.  I jumped on it and purchased the Elite version.  Thus, starting tonight, I will begin posting reviews of each module for the course as I complete them.  I need to wrap this up in two months, as I will be beginning grad school in August.

Monday, June 29, 2015

Hadoop Had a Case of the Mondays!

"Let me ask you something. When you come in on Monday and you're not feeling real well, does anyone ever say to you, 'Sounds like someone has a case of the Mondays'?" - Peter Gibbons, Office Space

As always, the weekend went way too fast and once again we are back in the office.  Typically, on either Friday or Monday, I will run the updates for the servers that I manage.  During this time I also check that my Hadoop cluster is running properly.  This morning I noted that the Ambari Metrics Collector service was not running.  All other services appeared to be fine.

Hoping it was a one-off, I went ahead and tried to start the service via Ambari.  As you run HDP more and more, you start to get a feel for when something is just not going to work.  In this case the start hung at 9% and I knew it was not going to come up.  I waited for the timeout and, sure enough, it didn't start.  Everything else appeared to be fine, so I went ahead and put the server in maintenance mode.  From there I rebooted the server.

Things went from bad to worse.  Now half the services were showing "heartbeat lost" as their status and the other half were showing as stopped (better than heartbeat lost, because at least you can try to start those).  When I tried to start all the services I would get an error about being unable to talk to the ZooKeeper service.  I rebooted again and the same issue continued.  Finally, I said to myself, let's shut the server down completely and then start it.  I couldn't help but think of Jurassic Park ("Hold on to your butts!").  I brought the server back up and everything was in the red "Stopped" status.  I hit the "Host Actions" button and selected "Start All Components".  Bam, everything went to green "Started".

Moral of the story?  A full shutdown and cold start is probably your best option when dealing with services that won't come up.
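For what it's worth, the stop-everything/start-everything cycle doesn't have to go through the UI; Ambari also exposes it over its REST API.  The sketch below is hedged: the host name, cluster name, and admin credentials are placeholders for your environment, not values from this cluster.

```shell
# Hedged sketch: stop and start all services via the Ambari REST API,
# roughly equivalent to the "Start All Components" button.
# AMBARI_HOST, CLUSTER, and admin:admin are placeholders.
AMBARI_HOST="ambari.example.com"
CLUSTER="mycluster"
BASE="http://${AMBARI_HOST}:8080/api/v1/clusters/${CLUSTER}/services"

# Stop everything ("INSTALLED" is Ambari's state name for stopped)
curl --connect-timeout 3 -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Stop All Services"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  "$BASE"

# Start everything back up
curl --connect-timeout 3 -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  "$BASE"
```

Each PUT returns a request ID you can watch in Ambari's background operations dialog to see the progress.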

Tuesday, June 9, 2015

Hadoop and Kerberos

I was doing some testing today and hit a major snag.  Even using the admin principal in Kerberos would not let me make a directory.  It's one of those situations where things are so secure that you can't access them, and that just does no one any good.  After about 20 minutes I finally figured out how to obtain a ticket for the hdfs user and then make directories.  Do the following:

su - hdfs

klist  <-----this will probably tell you that your ticket expired a while ago

kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@<your realm>

After you enter the above you should be presented with a blank prompt, meaning the ticket was issued.  From there you should be able to run whatever hadoop command you were looking to run.

Look on the NameNode for keytab files.
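The steps above can be collapsed into one small script.  This is a sketch only, assuming the default HDP keytab location, with EXAMPLE.COM standing in for your realm:

```shell
# Sketch of the ticket-refresh workflow above, run as the hdfs user.
# KEYTAB is the HDP default path; EXAMPLE.COM is a placeholder realm.
KEYTAB="/etc/security/keytabs/hdfs.headless.keytab"
PRINCIPAL="hdfs@EXAMPLE.COM"

if ! klist -s; then                    # -s: silent check, non-zero exit if no valid ticket
    kinit -kt "$KEYTAB" "$PRINCIPAL"   # -kt: authenticate from the keytab, no password prompt
fi

hdfs dfs -mkdir -p /tmp/testdir        # any hadoop command should work now
```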

Tuesday, June 2, 2015

Officially a CISSP

On April 9th I took one of the hardest tests of my life.  You hear stories about the CISSP, but nothing can quite prepare you for it.  A solid 60 days of studying, and it all came down to the four hours and change I used for the exam.  I was pretty sure I had failed, and something no one had told me was that the computer screen does not show whether you passed or failed.  All it said was to pick up your score report at the front, and that cemented for me that I did not pass.  Of course I got out there to find out I had passed!  The following week I submitted all of my paperwork and the wait began.  This afternoon I got the official email stating I am now a Certified Information Systems Security Professional!  It works out perfectly, as I go on vacation tomorrow and will definitely have some fun!

Friday, May 29, 2015

Hadoop - Deploy Your Own Infrastructure

Today cemented that, when dealing with Hadoop, it is always best to deploy your own infrastructure.  By infrastructure I mean items such as LDAP and Kerberos.  The automated options typically deal with the sandbox (at least with Hortonworks), and that isn't helpful when dealing with your production environment.  So if you are in an environment with no infrastructure, take the time and deploy it separately from Hadoop.  You'll save yourself a lot of headaches in the end.

Thursday, May 28, 2015

Hadoop and Kerberos

So I emailed the engineer at Hortonworks who wrote the tutorial I was following to see if there was a workaround to utilize FreeIPA.  Seems, as of right now, there isn't.  They are aiming for a July release for Ambari 2.1, but the catch is that you would still have to manually create the principals if you are utilizing FreeIPA.  Thus you really need to weigh your options:

1.  FreeIPA - easy setup and a very nice system, but you lose the automation in Ambari to deploy Kerberos

2.  Kerberos - not terribly hard to set up, but not as easy as FreeIPA (plus you'd have to set up LDAP on your own) - however, you get to use the automation for Kerberos in Ambari, and that might be one less headache to worry about
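If you go the FreeIPA route, the manual principal creation from option 1 might look roughly like this.  This is a sketch, not a tested procedure; the service type, host names, realm, and keytab path are placeholders, and you would repeat it for each principal your cluster needs:

```shell
# Hedged sketch: creating one Hadoop service principal by hand in
# FreeIPA, since Ambari cannot drive kadmin against an IPA KDC.
# All names below are placeholders for your environment.
PRINC="nn/namenode.example.com@EXAMPLE.COM"
KEYTAB="/etc/security/keytabs/nn.service.keytab"

kinit admin                            # authenticate as the IPA admin user

ipa service-add "nn/namenode.example.com"   # create the principal in IPA

# Pull a keytab for the new principal down to the Hadoop host
ipa-getkeytab -s ipa.example.com -p "$PRINC" -k "$KEYTAB"

chown hdfs:hadoop "$KEYTAB"            # lock the keytab down to the service user
chmod 400 "$KEYTAB"
```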

Tuesday, May 26, 2015

Hortonworks HDP 2.2.4 - Securing with Kerberos

As with most technologies, security was an afterthought with Hadoop.  During its creation the thinking was that a solid perimeter defense was enough, because hey, if you're inside you should have access to everything, right?  Yup, that lasted the first few years, until companies finally realized that using Hadoop in a regulated environment was going to require something more.  When I took my training course for HDP, Kerberos was mentioned.  Now, through no fault of my instructor, we didn't really cover it.  Why?  Because if you don't have the infrastructure in place, your lab is basically not going to work.

What infrastructure, you ask?  Mainly:

DNS
NTP (time synchronization)
Yes, you could probably get away without the DNS, but the time sync is vital to Kerberos.  Thus, when I deployed my cluster, I made sure I had my infrastructure in place.  DNS?  Check.  NTP?  Check.  Along with all the other configurations you should have in place prior to deploying.  Last week, after much research (not enough research, as I would learn), I was ready to secure my cluster.  As of now the cluster has nothing in it, and I figured that before pumping data in I should have the security in place.
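Before enabling Kerberos it's worth a quick sanity check that both pieces are actually in place.  A rough sketch, with a placeholder hostname (remember that Kerberos by default rejects clock skew beyond about five minutes):

```shell
# Quick pre-deployment checks for the infrastructure above.
# HOST is a placeholder for any node in the cluster.
HOST="namenode.example.com"

# The hostname should resolve via DNS (reverse lookups matter
# to Kerberos as well, so check those too)
getent hosts "$HOST"

# ntpq -p lists the peers ntpd is syncing against; an asterisk
# marks the currently selected time source
ntpq -p
```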

Now what items was I going to deploy?  They are as follows:

Kerberos - this explains itself
LDAP - User accounts are usually important
Apache Ranger - an awesome tool that allows you to restrict access and audit who touches what within Hadoop.  It works with HDFS, Hive, HBase, Storm, and Spark!  In Hive and HBase you can restrict down to the column level if you so desire.

Hortonworks provided a nice tutorial on rolling everything out, and the engineer recommended using FreeIPA.  I followed along and deployed FreeIPA without any issues.  Of course, as always happens with Hadoop, all the problems kicked in when trying to enable Kerberos.  First, it told me it couldn't connect to the server that was hosting Ambari.  As I searched the web I found out that you have to register the Ambari host with the cluster and then you are good to go.  If you run into this issue, go to the Hosts page and click on "Add Host".  Follow the normal prompts, but when asked what you'd like to install, uncheck everything.  You'll probably get some errors; you can ignore them and keep going.  Once you complete it, go ahead and delete the host (it will still be registered).

Once I got past that, it started to deploy the Kerberos client and again failed.  I checked the error messages and it stated it couldn't set up the principals.  Back to Google, and I tried a number of things.  Four hours later I found out the following: Ambari 2.0 was changed to make it easier to deploy Kerberos, removing the step where the administrator would have to manually create the keytabs.  Unfortunately, FreeIPA does not allow the use of kadmin, which is what Ambari requires to deploy Kerberos automatically.  So if you are setting up Kerberos and utilizing the latest HDP version, just be aware that you will be unable to use FreeIPA.  The people who work on Ambari do plan on fixing the issue in 2.1, but there doesn't appear to be a date for that release.
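Against a plain MIT KDC, kadmin is available, and the manual workflow that Ambari automates looks roughly like this (run on the KDC itself).  A sketch only: the principal name, realm, and keytab path are placeholders.

```shell
# Hedged sketch: manual principal and keytab creation on an MIT KDC,
# the kind of steps Ambari 2.0 drives through kadmin automatically.
# PRINC and KEYTAB are placeholders for your environment.
PRINC="ambari-server@EXAMPLE.COM"
KEYTAB="/etc/security/keytabs/ambari.server.keytab"

kadmin.local -q "addprinc -randkey ${PRINC}"    # -randkey: random key, no password
kadmin.local -q "ktadd -k ${KEYTAB} ${PRINC}"   # export the key into a keytab

klist -kt "$KEYTAB"                             # verify the keytab entries
```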