Friday, July 31, 2015

Becoming a Big Data Nerd!

     Security was where I always thought I would see myself.  My undergraduate career was spent in security classes, learning just about every topic a security professional should.  The hard reality of that education was that without experience, no one is going to hire you as a security professional.  I know many would say this isn't always the case, but generally it is.  Of all the positions I applied for when I finished school, only around three that were security related actually got back to me.  I got through the process for two of them, but in the end they fell through.

     After that it was crunch time, and that meant taking whatever job I could to start paying my student loans.  That's when I began to learn that you have to start somewhere; once you start gaining that professional experience, you can move to the area you want.  When I speak to those new to the industry, I tend to use this analogy:  "you can't secure a technology if you have no foundation in the setup and operation of said technology."

    Four years into my career, my ship had finally arrived!  I was hired to perform security related work.  For the first year I was reviewing network designs, advising on network changes (to make sure they held up to security standards), auditing platforms, and keeping the industry in which I work informed of the newest security threats.  Things began to change a little bit when actual monitoring was made a priority for my unit.  It was always part of the unit's work, but it was simplistic at the time (Nagios for website monitoring and NetFlow Analyzer for reviewing NetFlow).  A member of my unit came up with the idea of centralizing everything we monitor (with some additions) into one location.

     To that end we moved to the ELK stack (Elasticsearch, Logstash, and Kibana).  I had never heard of it, but management bought in and sent us to training.  From there, with a lot of trial and error, we got the system up and running.  I spent a great deal of time getting ELK to stay up longer than a few days.  To date, my cluster has been up for over 100 days (it would be longer, but I accidentally stopped it once) and we are handling 63 events per second.
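     If you are fighting similar uptime problems, two quick checks against Elasticsearch's REST API will tell you a lot about cluster state.  A rough sketch (localhost:9200 is just the default address; substitute one of your own nodes):

curl -XGET 'http://localhost:9200/_cluster/health?pretty'  <-----overall status (green/yellow/red) and unassigned shard counts

curl -XGET 'http://localhost:9200/_cat/nodes?v'  <-----confirms every node is still joined to the cluster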

     With that in place, the same team member then suggested that we move further into data analytics and start looking into Hadoop.  He had used applications built on it and felt it would definitely be worthwhile for our unit's mission.  Obviously, we had to make it a little simpler, because building a Hadoop cluster is tough when you have to figure out which components to use and whether they work together.  My boss charged me with finding a distribution to use.  In my research I found that our choices were Cloudera or Hortonworks.  Cloudera seems to have much more information on the web, but they charge for any of the features you'd really want to use.  Hortonworks, in turn, gives everything away and charges for support along with training.

    From there I was instructed to find training.  I found a really good training center (/training/etc if you are looking for quality training from awesome instructors).  I was sent for the Operations course designed by Hortonworks.  My boss and another coworker attended the Data Analyst course.  This past week my boss and I completed the Data Science course.

    About a month ago I started reading more and more about data analytics.  While security is important, detection and analysis of data is the bread and butter.  I decided that after this training course, if I was still interested, I would move down the analytics road.  The course was amazing, and while it was a 10,000 foot view, I knew where I should head.  Experience, as with anything in technology, is important, and on a daily basis I work with the tools of Big Data.  I run a Hadoop cluster and analyze various system/network data in Elasticsearch.  I'll be applying for a Masters in Analytics and hopefully in two years I will move into Data Engineering.

    The moral of the story?  The journey you start doesn't always have the end point you think it will.

Tuesday, July 7, 2015

eLearnSecurity - Web Application Penetration Testing Certification

Today I received an email about getting 40% off the course if I registered by 7/10/15.  I jumped on it and purchased the Elite version.  Thus, starting tonight, I will begin posting reviews of each module of the course as I complete them.  I need to wrap this up in two months, as I will be beginning grad school in August.

Monday, June 29, 2015

Hadoop Had a Case of the Mondays!

"Let me ask you something. When you come in on Monday and you're not feeling real well, does anyone ever say to you, "Sounds like someone has a case of the Mondays?" " - Peter Gibbons - Office Space

As always, the weekend went way too fast and once again we are back in the office.  Typically, either Friday or Monday, I will run the updates for the servers that I manage.  During this time I also check that my Hadoop cluster is running properly.  This morning I noted that the Ambari Metrics Collector service was not running.  All other services appeared to be fine.

Hoping it was a one-off, I went ahead and tried to start the service via Ambari.  As you run HDP more and more, you start to get a feel for when something is just not going to work.  In this case the start operation stalled at 9% and I knew that it was not going to finish.  I waited for the timeout and, sure enough, it didn't start.  Everything else appeared to be fine, so I went ahead and put the server in maintenance mode.  From there I rebooted the server.

Things went from bad to worse.  Now I was getting "heartbeat lost" as the status for half the services, and the other half were showing as stopped (better than heartbeat lost, because at least you can try to start them).  When I tried to start all the services, I would get an error about being unable to talk to the ZooKeeper service.  I rebooted again and the same issue continued.  Finally, I said to myself, let's shut the server all the way down and then start it.  I couldn't help but think of Jurassic Park ("Hold on to your butts!").  I brought the server back up and everything was in the red "Stopped" status.  I hit the "Host Actions" button and selected "Start All Components".  Bam, everything went to green "Started".
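As an aside, everything I clicked through in the UI can also be done against Ambari's REST API, which is handy when the UI itself is misbehaving.  A rough sketch (admin:admin, ambari-host, and mycluster are placeholders for your own credentials, server, and cluster name; INSTALLED is Ambari's term for stopped):

curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{"RequestInfo":{"context":"Stop All Services"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' http://ambari-host:8080/api/v1/clusters/mycluster/services

curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' http://ambari-host:8080/api/v1/clusters/mycluster/services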

Moral of the story?  A full shutdown and cold start is probably your best option when dealing with services that won't come up.

Tuesday, June 9, 2015

Hadoop and Kerberos

Was doing some testing today and I hit a major snag.  Even using the admin principal in Kerberos was not allowing me to make a directory.  It was one of those situations where things are so secure that you can't access them, and that just does no one any good.  After about 20 minutes I finally figured out how to get a Kerberos ticket for the hdfs user and then make directories.  Do the following:

su - hdfs

klist  <-----this will probably tell you that your ticket expired a while ago

kinit -kt /etc/security/keytabs/hdfs.headless.keytab hdfs@<your realm>

After you enter the above you should be presented with a blank prompt (kinit prints nothing on success).  From there you should be able to enter whatever hadoop command you were looking to run.
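For example, the directory creation that was failing for me should now go through (the path and ownership below are just placeholders for whatever you are trying to create):

hdfs dfs -mkdir /user/analyst

hdfs dfs -chown analyst:analyst /user/analyst

hdfs dfs -ls /user  <-----confirm the new directory is there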

Look on the NameNode for keytab files.
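If you are not sure which principal a keytab actually holds, klist can read the keytab directly (the paths below are the HDP defaults; yours may differ):

ls -l /etc/security/keytabs/  <-----HDP's default keytab directory

klist -kt /etc/security/keytabs/hdfs.headless.keytab  <-----lists the principals stored in the keytab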

Tuesday, June 2, 2015

Officially a CISSP

On April 9th I took one of the hardest tests of my life.  You hear stories about the CISSP, but nothing can quite prepare you for it.  A solid 60 days of studying and it all came down to the four hours and some change I used for the exam.  I was pretty sure I had failed, and something no one had told me was that the computer screen does not show whether you passed or failed.  All it said was to pick up your score report at the front, and that cemented for me that I did not pass.  Of course, I got out there to find out I had passed!  The following week I submitted all of my paperwork and the wait began.  This afternoon I got the official email stating I am now a Certified Information Systems Security Professional!  It works out perfectly, as I go on vacation tomorrow and will definitely have some fun!

Friday, May 29, 2015

Hadoop - Deploy Your Own Infrastructure

Today cemented that when dealing with Hadoop it is always best to deploy your own infrastructure.  When I say infrastructure, I mean items such as LDAP and Kerberos.  The automated options typically deal with the sandbox (at least with Hortonworks), and that isn't helpful when dealing with your production environment.  So if you are in an environment with no infrastructure, take the time and deploy it separately from Hadoop.  You'll save yourself a lot of headaches in the end.
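For what that looks like in practice, here is a minimal sketch of standing up a standalone MIT KDC on CentOS/RHEL (EXAMPLE.COM is a placeholder realm; you still need to edit /etc/krb5.conf and kdc.conf for your own environment before running these):

yum install krb5-server krb5-libs krb5-workstation  <-----KDC, libraries, and client tools

kdb5_util create -s -r EXAMPLE.COM  <-----create the Kerberos database and stash the master key

kadmin.local -q "addprinc admin/admin@EXAMPLE.COM"  <-----add an admin principal

service krb5kdc start

service kadmin start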

Thursday, May 28, 2015

Hadoop and Kerberos

So I emailed the engineer at Hortonworks who wrote the tutorial I was following, to see if there was a workaround to utilize FreeIPA.  Seems, as of right now, there isn't.  They are aiming for a July release of Ambari 2.1, but the catch is you would still have to manually create the principals if you are utilizing FreeIPA.  Thus you really need to weigh your options (see the sketch after this list for what the manual FreeIPA route looks like):

1.  FreeIPA - easy setup and a very nice system, but you lose the automation in Ambari to deploy Kerberos

2.  Kerberos - not terribly hard to set up, but not as easy as FreeIPA (plus you'd have to set up LDAP on your own) - but you get to use the automation for Kerberos in Ambari, and that might be one less headache to worry about
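For reference, here is a minimal sketch of what manually creating one of those principals in FreeIPA might look like (nn/namenode.example.com and ipa.example.com are placeholder names; a Kerberized Hadoop cluster needs a principal and keytab like this for each service on each host):

ipa service-add nn/namenode.example.com  <-----register the NameNode service principal

ipa-getkeytab -s ipa.example.com -p nn/namenode.example.com -k /etc/security/keytabs/nn.service.keytab  <-----export its keytab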