What Makes Reggae, well, *Reggae*?

I love reggae music. It’s my absolute favourite type of music. There’s something about it: it’s clearly the music of ordinary people, which gives it depth, and the rhythm is simply addictive. You just cannot stand still when you hear it.

What’s so special about the reggae rhythm? At first listen, it’s just the regular 4/4 beat, yet somehow different at the same time. And even within reggae, the pop style of UB40, for example, is very different from the roots reggae of Bob Marley. This got me wondering.

Then, a few months ago, I saw the 2012 documentary ‘Marley’. It is an amazing mosaic of the life of an amazing man, and ‘amazing’ is a severe understatement. It was available on Netflix for a while, but it has since been removed, at least here in the USA. You can still buy it from Amazon, or watch it in low resolution here. I highly recommend buying and keeping it; it’s that good.

From this documentary, I learnt how reggae came to be, and what makes it different. A typical 4/4 beat emphasizes the first beat the most, and the third as well. Here’s a good example, ‘Bohemian Rhapsody’ by the rock band Queen. You can clearly hear the ***1***-2-**3**-4 if you listen for the places where the drums come in, as they do at 02:00 in the video.

In this case, the numeral in bold and italics is the strongest beat, and the numeral in bold alone is strong too. We could represent it like this pictorially:

The thicker and longer the line, the stronger the beat. You can imagine beating a drum with a force proportional to the thickness and length of the arrow.

In reggae, it is the offbeat (the ‘2’) that is emphasized, as well as the next two beats. The ‘2’ and ‘4’ are typically guitar, and the ‘3’ is the bass drum. Now the beat is 1-234. Bunny Livingston, one of the original Wailers, explains it really well in this short clip from ‘Marley’:

So in our pictorial representation, this is what the reggae beat would look like:

This reggae beat was most likely discovered by accident. Bob Andy, a recording artist at Studio 1, where Bob Marley and the Wailers did many of their early recordings, said that there was some equipment they had brought from the USA, which they tried out one day. It was a tape delay, and it caused an echo of the guitar strum, which led to the ‘chekke’ sound typical of the reggae guitar. Carlton Davis, a session drummer for the Wailers, talks about this sound in ‘Marley’, and you can hear it clearly in the Bob Marley song that follows as well:

The drum beat comes on the ‘3’ and is usually quite strong, which is why it is 1-234. The ‘chekke’ comes on the ‘2’ and often on the ‘4’ as well, and the bass guitar also comes in on the ‘2’ and ‘4’. Aston Barrett, bass guitarist of the Wailers, explains it here:

This is what makes reggae unique. This placement of guitar and drums is really important: if you switch the two, with drums on the ‘2’ and ‘4’ and guitar on the ‘3’, it sounds totally different, as you can hear in this song from the 70s by The Police:
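
If you would like to hear the two feels side by side without cueing up full songs, here is a small sketch of my own (it is not from the documentary, just an illustration): it writes two short click tracks as WAV files using only the Python standard library, one weighted like the rock beat from the Queen example and one weighted like the reggae beat described above.

```python
# A minimal sketch: synthesize two four-beat click tracks whose loudness per beat
# follows the rock emphasis (strong 1, medium 3) and the reggae emphasis
# (strong 3, medium 2 and 4). The tempo and tone are arbitrary choices of mine.
import math
import struct
import wave

RATE = 44100                        # samples per second
BPM = 72                            # tempo; one beat lasts 60/BPM seconds
BEAT_SAMPLES = int(RATE * 60 / BPM)

def click(amplitude, freq=880.0, length=0.08):
    """A short decaying sine 'click' whose loudness encodes the beat strength."""
    n = int(RATE * length)
    return [amplitude * math.sin(2 * math.pi * freq * i / RATE) * (1 - i / n)
            for i in range(n)]

def write_pattern(filename, emphasis, bars=4):
    """emphasis: four numbers in [0, 1], one per beat of the 4/4 bar."""
    samples = []
    for _ in range(bars):
        for amp in emphasis:
            beat = click(amp)
            beat += [0.0] * (BEAT_SAMPLES - len(beat))   # silence until the next beat
            samples.extend(beat)
    with wave.open(filename, "w") as w:
        w.setnchannels(1)
        w.setsampwidth(2)           # 16-bit samples
        w.setframerate(RATE)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767)) for s in samples))

write_pattern("rock.wav",   [1.0, 0.3, 0.7, 0.3])   # 1-2-3-4: strongest on 1, strong on 3
write_pattern("reggae.wav", [0.2, 0.7, 1.0, 0.7])   # 1-234: strongest on 3, strong on 2 and 4
```

Play the two files back to back and the shift of weight away from the ‘1’ is hard to miss.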

So, now that you understand the reggae beat, let me leave you with a very nice reggae song from Steel Pulse. Enjoy!

Why Machine Learning or Deep Learning, and How to Choose

Two recent incidents prompted me to write this article. First, I was writing a proposal for a potential partner of my startup, on how we could work with them to bring AI into their products and applications. They turned around and asked us why they should want to develop AI applications in the first place.

Second, I was chatting about my startup with an investor friend. After he understood that we had built a platform to accelerate the adoption of AI techniques by enterprises, he told me: “Shekhar, you are two steps removed from the problem. Enterprises today don’t even understand what AI is, or how it differs from traditional techniques. They need to get this first.”

Of course I knew this, and had been talking about it, but I realized I needed to write it down. A truly comprehensive answer would be customized to the specific customer and problem; still, I do have some general points, and here they are.

Analytics can, of course, be done without AI, and has been for a long time. In many cases, though, the non-AI approach is severely limiting. The list below is not exhaustive; it covers some of the main reasons why many enterprises are moving to AI models for data analysis.

  • The traditional data analytics approach is reactive. Data is collected from various sources and analyzed with tools that display it in various ways: dashboards, graphs, logs, and so on. The patterns these tools look for are based on existing knowledge, so this is akin to monitoring, and responding to triggers requires human intervention whenever something new is seen. With AI, the response becomes proactive: AI predicts the triggers and automatically applies the remediation, a further level of automation with a significantly reduced need for human intervention.
  • AI makes predictions and decisions based on the data fed into the model, past and present. The model learns from that data, and the more realistic data it sees, the better it learns. This steadily improves its predictions and decisions, so a prediction made in the past will generally differ from one made later, even on the same input, because in the meantime the model has learnt more and become more accurate. A non-AI predictive model, by contrast, returns the same prediction for the same input no matter when it is asked, since it is not learning. (The sketch after this list illustrates the point.)
  • Traditional models were designed to work with less data. Nowadays, every organization collects amounts of data that simply could not be gathered before, and the newest deep learning algorithms perform better the more data they are given. Data volumes of this size would often overwhelm traditional techniques.
  • With unsupervised learning, the newer AI techniques can detect patterns in the data that would not be obvious to humans. For example, it was recently determined that the plague, or Black Death, in 14th-century Europe was more likely spread by humans via lice rather than by rats, as was earlier thought. This was determined by simulating outbreaks in various cities with different models (rats, airborne, fleas/lice) and finding which fit best. Had the data been fed to a deep learning model with unsupervised learning, it could potentially have discovered the patterns that led to that conclusion faster, without the need for simulations.
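
To make the second point above concrete, here is a minimal sketch of my own (plain NumPy, not tied to any particular product): a tiny logistic model trained incrementally on a stream of synthetic data, re-scoring the same fixed input after each batch. The prediction for that input keeps changing as the model learns, which a static, non-learning rule would never do.

```python
# A minimal sketch: a tiny logistic model updated with stochastic gradient descent
# as data arrives. The data, the "alert" rule, and the probe input are all made up.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def next_batch(n=100):
    """Hypothetical monthly data: x = (two sensor readings), y = 1 if an alert fired."""
    X = rng.normal(size=(n, 2))
    y = (X[:, 0] - X[:, 1] > 0).astype(float)   # the underlying rule the model must learn
    return X, y

w = np.zeros(2)                    # model weights, updated as data keeps arriving
probe = np.array([1.0, 0.2])       # one fixed input that we re-score every month

for month in range(1, 7):
    X, y = next_batch()
    for xi, yi in zip(X, y):       # one SGD pass over this month's data
        w += 0.1 * (yi - sigmoid(w @ xi)) * xi
    print(f"month {month}: P(alert | probe) = {sigmoid(w @ probe):.3f}")

# The same probe input gets a different, increasingly confident prediction each month,
# because the model has learnt more in the meantime. A fixed rule-based system would
# return exactly the same answer every time.
```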

Within the broad category of AI, there are multiple categories of learning. The latest, and the one most cited these days, is deep learning. The main difference between deep learning and traditional machine learning lies in how the features required to solve the problem are identified. With traditional machine learning, feature selection and engineering are done manually. With deep learning, the system figures out which features are important for making the prediction and weighs them automatically when making decisions.

For example, suppose our problem is weather prediction, and we have collected data on various weather phenomena from the past. To be more specific, let’s say we want to predict the likelihood of a natural fire in wooded areas. We might define the features required for this prediction as humidity, temperature, and the density of trees in a given area, among others. For traditional machine learning, we would need to create algorithms that predict the possibility of a fire as a function of these chosen features. A deep learning algorithm, on the other hand, would figure out on its own which inputs matter for the prediction. Suppose we added a feature such as the population of rabbits in the given area. The traditional machine learning model would use this data in its prediction only if it were explicitly programmed to; the deep learning model would learn from its training data that the rabbit numbers are not relevant to the prediction and would give them little or no weight.
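
As a concrete sketch of the two workflows, here is a small example on an entirely made-up fire dataset; the feature names are simply the ones from the paragraph above, and all the numbers are synthetic. scikit-learn stands in for a full deep learning framework here: the ‘traditional’ model only sees the columns we picked by hand, while the small neural network is handed every column, including the irrelevant rabbit population, and left to work out which inputs matter.

```python
# A minimal sketch, not a production model: all data here is synthetic, and
# scikit-learn's small MLP stands in for a full deep learning framework.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n = 5000
humidity     = rng.uniform(10, 90, n)    # percent
temperature  = rng.uniform(5, 45, n)     # degrees C
tree_density = rng.uniform(0, 1, n)      # fraction of area covered
rabbits      = rng.uniform(0, 500, n)    # per km^2 -- irrelevant by construction

# Synthetic "ground truth": fire risk depends only on the first three features.
risk = 0.04 * temperature - 0.03 * humidity + 2.0 * tree_density
fire = (risk + rng.normal(0, 0.3, n) > 0.5).astype(int)

X_all = np.column_stack([humidity, temperature, tree_density, rabbits])
X_tr, X_te, y_tr, y_te = train_test_split(X_all, fire, random_state=0)

# Traditional ML: we decide up front which columns are the features.
hand_picked = [0, 1, 2]   # humidity, temperature, tree density
traditional = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
traditional.fit(X_tr[:, hand_picked], y_tr)
print("hand-picked features:", traditional.score(X_te[:, hand_picked], y_te))

# Deep-learning style: hand over every column, rabbits included, and let the
# network learn during training which inputs actually matter.
network = make_pipeline(StandardScaler(),
                        MLPClassifier(hidden_layer_sizes=(32, 16),
                                      max_iter=500, random_state=0))
network.fit(X_tr, y_tr)
print("all columns, learned features:", network.score(X_te, y_te))
```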

Deep learning also keeps improving as more training data is fed to it, whereas traditional models taper off after a while, as seen in the graph below. This makes it a good technique for cases where a lot of training data is available.
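
If you want to see where that crossover happens for a problem of your own, one quick check is to train both kinds of model on growing slices of the training data and compare the scores. Here is a minimal sketch, again with synthetic data and scikit-learn standing in for a real deep learning stack; on toy data the gap can be small, or even reversed, but the shape of the two curves is the point.

```python
# A minimal sketch: score a linear model and a small neural network on growing
# slices of the training data. Synthetic data only; in practice you would
# substitute your own dataset and a real deep learning framework.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=20000, n_features=30, n_informative=20,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

for n in [500, 2000, 8000, len(X_tr)]:
    lr = LogisticRegression(max_iter=2000).fit(X_tr[:n], y_tr[:n])
    net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300,
                        random_state=0).fit(X_tr[:n], y_tr[:n])
    print(f"{n:>6} samples | linear: {lr.score(X_te, y_te):.3f} "
          f"| network: {net.score(X_te, y_te):.3f}")

# Typically the linear model flattens out early, while the network keeps
# improving as more training data becomes available.
```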

Good examples of where to use deep learning are threat detection, where there are many samples of attacks (such as network intrusions), and predictive maintenance for IoT devices, where a lot of failure data is available (Hitachi uses this on its remote earthwork machines).

This also means that where very little training data is available, deep learning is not a good technique. An example is spearphishing attacks: the number of successful attacks is very small, so there is not much training data for a deep learning model, and the rate of false positives ends up so high that this is not a good technique for predicting these specific types of attacks (see this paper on spearphishing).

Here is a cheat sheet for deciding which type of machine learning technique to use for a given problem. The original, which can be found here, has links to more details on each individual technique.

Lunar Rainbow at Victoria Falls

Victoria Falls is an amazing wonder in its own right. But about three or four times a year, a remarkable phenomenon occurs: during the nights of the full moon, there is enough light to create a lunar rainbow over the falls. This happens in the months of June and July, when it is winter in the southern hemisphere.

At the same time, the water over the falls is at its maximum. In the summer months, there is less water, with the Zimbabwe side seeing the most, and the Zambia side mostly dry. But in the winter, there is a single sheet of water all across the falls.

The Avani Livingstone Resort on the Zambia side is probably the best place to see the falls from. It is right next to the falls, and it takes about five minutes on foot along the falls trails to reach the viewing points.