Categories
Day To Day

Eat Smart, Eat More – My Transformation – Part 2

A thought I’d always pushed to the back of my head was weighing my food. Chris made nutrition a core focus, and rightfully so; through my transformation I have learnt that getting into shape is an 80:20 ratio between nutrition and training. Whether the reason was the extra effort involved, not wanting any constraint on my food habits or just plain laziness, my shred came down to me turning to food scales after all and starting a food diary.

Before we unfolded the nitty-gritty of training, I was put on a two-week trial of the calorie-tracking app MyFitnessPal. The aim of the trial period was for Chris and me to gauge an understanding of my current eating trends; I ate as normal, without any deficit or focus on macronutrients. A calorie deficit target was then calculated from this to work against: 1,800 calories a day, with 120g of protein daily.

Portion control and protein intake were both key, as I was originally overconsuming by quite a margin. Hence, I was put on a High Protein, Low Carb, Low Fat diet, which was even more challenging than usual because I am a vegetarian. It was difficult but not impossible, and I was readier than ever to tackle this challenge.

Now, going back to the food scale: I was in search of something that wasn’t too expensive or overly complicated and did the job. After exploring Amazon I settled on the food scale below, which I still use daily and highly recommend:

Exzact Electronic Kitchen Scale – EX4350 – £12.99 off Amazon. https://www.amazon.co.uk/gp/product/B00Z7VLGL4/ref=ppx_yo_dt_b_search_asin_title?ie=UTF8&psc=1  

My relationship with food has been difficult since a young age – what most would call “comfort eating” – but now it was time to fix this mess. I realised that my diet was dairy heavy, so I made the drastic decision to go Vegan. Chris gave me a guideline on my macronutrient breakdown; investigating which foods to eat to fit my nutrition was my job. I had already cut out processed sugars, chocolate and snacking from my diet; these were all bad habits I had to overcome. I was eating five meals a day: breakfast, a mid-morning snack, lunch, dinner and usually a post-workout snack.

Nutrition

My meals on any given day of the week were a combination of the following:

Breakfast (Either)

  • Avocado and Mozzarella on Toast
  • Bowl of Oats and Soya Milk with sliced Banana pieces

Mid-Morning Snack (Either)

  • Banana & Pear
  • Protein Shake
  • Fruit

Lunch (Either)

  • Tofu Salad
  • Red Lentil Protein Pasta
  • Khaman Dhokla (Gujarati Rice Cakes)
  • Oat Pancakes
  • Morrisons Protein Noodles
  • Lentil Curry & Chapatis (x2)

Dinner (Either)

  • Broccoli & Beans
  • Mixed Beans Mash
  • Sweet Potato Mash
  • Pasta & Protein Shake

Post-Workout

  • Pineapple & Protein Shake

Vegan Protein Shake

In the past, I have tried dozens of protein shakes which either taste like chalk or taste good but don’t have substantial nutritional value. I finally discovered what I still believe to be the best protein shake on the market (in terms of protein per serving) – Form Nutrition’s Performance Protein.

This amazing plant-based pea protein shake gives a massive 30g of protein per serving (2 scoops) and comes in 3 flavours – Vanilla, Tiramisu and Chocolate Peanut. I have tried all of them, but Vanilla comes out on top as my personal favourite. It isn’t only very nutritious but tastes amazing too – a must try!

Protein Noodles

These are hard to get hold of as they are sold out pretty much all the time. However, on the occasions I have managed to get my hands on them, they have proved to be a great protein source and taste delicious. I could only find them in Morrisons and Tesco:

https://groceries.morrisons.com/webshop/product/Morrisons-Market-St-Protein-Noodles/395701011

https://www.tesco.com/groceries/en-GB/products/299847531

Lentil Curry

When it comes down to lentils, there are many different types available as dry goods on the market today. Fortunately, this works in my favour as I am a Gujarati and they fit perfectly into our cuisine so it wasn’t anything new to me. I’ve grown up eating lentil curry all my life. Here are my personal favourites:

  • Split Chickpeas (brown) – Chana Dal
  • Black Gram Lentils – Urad Dal
  • Chickpeas (White) – Kabuli Chana
  • Petite Yellow Lentils – Mung Dal

The beauty of lentils is how adaptable they are: you can have them with chapatis, pitta bread or rice, in a salad, or even on their own. Simply fabulous. Low in calories, super tasty, high in protein and quick & easy to make.

This perfectly leads me on to my concluding notes on this second part out of three blogs – Portion Size.

Portion Sizes

Initially, when I started my food diary, portion size was a big problem because every food is weighed differently.

The measurement types I saw were cups, grams, packs and pieces. After much research, I came to understand that different types of food measure differently in cups, i.e. 1 cup of a liquid is not the same weight as 1 cup of dry goods. As a ballpark, I took 1 cup to equate to 128g across the board for any food I ate that was measured in cups. For things like pasta I would take the raw weight before boiling. Curries were harder, as a lot of things go into the final product, so I would weigh out 128g of the overall curry and eat against this measure.

For most foods I made use of the barcode scanner feature that MyFitnessPal offers. Life saver!

Sticking to 1,800 calories wasn’t too bad after all. Being in a deficit is just another challenge you have to overcome to reach your goal. Soon enough you will learn that if you eat smart, you can eat more. Healthy food isn’t boring if you make it fun. I think I just proved this – look how happy I look after my shred:


This picture is from my last session with Chris. After 11 months of hard work, I had lost 22kg and 22% body fat. It was now time for THE BULK.
Categories
Day To Day

Flab To Fit – My Transformation – Part 1

A Dark Beginning

It’s been quite some time; you know, when life just happens and seems to take over.

I found it difficult to stay on track with my blogging last year: my sister’s wedding prep, a job promotion, and focusing on my passion – fitness. That’s exactly what this blog is about: my transformation (as a trilogy).

As I write this, I am recovering from a dislocated patella – or kneecap, for those who don’t know. I am trying my best to see this in a positive light. Maybe this injury is a blessing in disguise; I can use it to re-connect with my love for writing.

One thing is for sure: don’t fixate on the long-term goal. Keep it at the forefront of your mind, but break it down into multiple small short-term goals. When I started my transformation 29 months ago, little did I know I would be sitting here writing this. I had a long-term goal in mind: to be as ripped as possible.

I soon realised on my journey that my overall goal could be broken down into periodical phases creating small milestones. But first and foremost, it’s all a mind game; you need to conquer your mind before you can conquer your body. REMEMBER: Change starts from within, before making its way outwards. Let’s talk about how it all started…

It’s true when they say you need to reach absolute rock bottom before you can hit your peak. You need an awakening, and my awakening came in September 2017. I was at my absolute lowest: unhealthy, overweight, unfit and depressed. I was in a place I had never been before – complete darkness. This was my wake-up call; I knew it was now or never. I could go one of two ways: the first, do nothing and continue down this dark rabbit hole; or the second, get my ass up, bust some iron in the gym and get into shape. I chose the latter (obviously!) and I am damn well glad I did! Today I’m in the best shape of my life, fitter than ever (well, not at this exact moment, but you know what I mean!) and at my happiest – both mentally and emotionally.

September 2017 – My Starting Point

I mean, don’t get me wrong, I was an active lad back at uni and trained regularly, but I knew there was something missing – I lacked direction. This is why I felt it was important to have some professional guidance. If I’m completely honest, I don’t think my transformation would have been possible without the guidance of my personal trainer and nutritionist, Chris Mears. How did I find him? A recommendation from a friend who was also training with him – the strength of word of mouth. It came at the perfect time; a bit of luck and timing did it for me. I made the decision to contact Chris and get my consultation booked in. I still remember it: my first personal training session happened on Wednesday 27th September 2017 at what was then called FunctionFit (now known as Chris Mears Health & Fitness).

The Reality Check

37% body fat. 97kg. These were the results of my first weigh-in. Rather shocking. I hadn’t realised how much I had spiralled out of control with my bad eating habits and lazy attitude. It was a shock, a big one, but a much needed one. This built the foundation of my nutritional programme: a High Protein, Low Carb and Low Fat diet. Being vegetarian bumped up the challenge that much more; with a target of 120g of protein every day, the initial two-week period was an experiment. I was doing things I never thought I would: weighing my food and tracking my calories in MyFitnessPal. I soon came to realise that fitness is not just a choice, it is a lifestyle. I am not going to say it was easy. The first month was tough – not only getting used to the routine, but the looks of shock from my family as my every meal turned into numbers.

The reality check was that I was overconsuming an enormous amount of calories; to make up for this, I needed to create a deficit big enough to burn into my body’s fat stores whilst training. But isn’t this what personal trainers are for? Chris did the maths and set me a cap of 1,500 calories a day with a protein intake of 120g. The interesting part came when I had to explore WHAT foods to eat and what NOT to eat. It took me a good month before I could hold my ground with a proper meal-prep timetable. Dairy was a food group that featured heavily in my diet (and was also a big reason for my overconsumption), so I completely eliminated it. I went Vegan. Some may say this is a bit extreme, but for me, extreme or not, I had one goal and one goal only – weight loss. Just something to note here: Chris DID NOT restrict me. I restricted myself. After my reality check, I was ready to go to any extent to pursue my goal – I agreed on two personal training sessions a week with Chris, followed by a rigorous food plan and gym sessions. It truly was No Pain No Gain.

As I mentioned earlier, a key point was conquering my mind before my body. I had to mentally prepare myself; I was feeling very hungry and, hormonally, I was getting really grumpy. The shortage of food was having a big impact on my body, but I was only feeling this way because I was letting myself. I had jumped into the deep end without conquering myself mentally first; I was made aware of this during my consultation, but I didn’t realise how much of an impact it was going to have. The first month was the biggest struggle, continuously battling my mind – I wanted to quit, but I always asked myself why I started, which kept me going. I remembered what Chris always told me: ‘I can only guide you, but you have to help yourself by following through’. On another note, I was paying a price for this and couldn’t just let that go to waste (money wasn’t the most important factor, by the way, but something to consider as well).

So, here I was, a couple months into my transformation. Vegetarian to Vegan. It was time to now begin my Shred.

Categories
Day To Day

Loughborough to Lake District…and more!

It’s been a while, guys. I haven’t blogged in a long time – you know, summer plans and then not having time. Now that things are less busy, it’s catch-up time.
There have been a fair number of shenanigans that I have been up to during this summer, and I am hoping time allows me gradually to blog about each of them. Today however, I am going to be talking about my visit to the Lake District.
This trip was in the making for about 5 years (if I remember correctly), with my bunch of uni friends – this very specific group. You know how we have different groups of friends: one from our lectures, another from a society we may have joined, then another with whom we may have shared accommodation... you get the idea.
This particular group was a mix of lectures and my dance group. Yes, I used to take part in dance for charity balls back at university; at the time it was all fun and games, but what I didn’t know was that I would make friends for a lifetime as a result. I’m not going to get into the what and why of the dance, but looking back, it was extremely well worth it.
The great thing about this reunion was not only that we got to spend an entire weekend together (yes, a whole weekend – this feels super special when you are working 9-5 week in, week out; I am sure many of you can relate), but also being able to see how each of us has developed both professionally and personally since we left university. We exchanged conversations about our jobs: the pros and cons, the dos and don’ts, what’s worked out, what hasn’t, current projects, employees, managers, work ethic, and what’s to come! Staying up late into the night indulging in food and games and chatting about everything and anything was truly an experience in itself – something us working people don’t actually get to do much, as we are so stuck in our daily routine that we become habituated to it.
Just getting the damn thing organised was a big surprise in itself; I think it got to a ‘do it now or never’ sort of situation, and we went for the former. As techie as I am, it was the first time I had actually used Airbnb (I didn’t book it, my friend did), but I got a full understanding of what it is and how it works. (If you are unaware of what Airbnb is, then have a read on Google :P). This was really cool – the place itself was decent, a beautiful lodge in Lancashire.
We arrived there Friday night, and the plans were as follows:
• Chill Friday night with drinks (I don’t drink but they did), games and catch up
• Saturday – Spend the day at Windermere on a 4-hour boat ride, then go out to eat
• Sunday – Leave at 1pm and go to Northern Soul Grilled Cheese in Manchester to eat, with a spontaneous trip to the Peak District on the way home
The boat ride on Lake Windermere was surprisingly good. I was in two minds, but it turned out to be a gem of an idea. I was captaining the boat, and this was awesome! We just chilled in the middle of the lake for 2.5 hours, and as the World Cup was on at the same time, we had the game up on the iPad. What’s more, England were playing! Never did I think I would be tuning into an England World Cup match in the middle of Lake Windermere, on a boat, with a group of friends. What’s the word – LIT! On the way back from the boat trip we had an awesome dinner in a nearby restaurant and then just had to have ice cream (which melted in the sun anyway, lol).
Saturday night turned out to be even better. We all chilled on the nearby lakeside, just talking about life and each of our upcoming plans, and got into the marriage talk (one of our friends who was supposed to be attending couldn’t, as he was back in India getting engaged), so he was the topic of conversation. The views here were mad – enjoying them with this lot was the icing on the cake. We were out late enough to get a glimpse of the stars in pitch-black darkness. This was truly bliss – embracing nature. Just being able to absorb what I was seeing around me was truly remarkable; it made the wait and the weekend all very worth it.
Then Sunday came around, and none of us wanted to leave. We were having so much fun that we had lost track of time. To stick around longer, we made a plan to go to Manchester to the very well-known ‘Northern Soul Grilled Cheese’. My friends had tried it and couldn’t recommend it enough, so I thought, yes, may as well see what all the hype is about. And my god, they were the best grilled cheese sandwiches I have ever had. It was surreal – too many good things were happening consecutively: spending time with best buddies, amazing views and awesome food! Getting back to the point: if you are around Manchester any time, do visit Northern Soul Grilled Cheese – two thumbs up.
We parted ways once we left Manchester, each of us heading in our designated directions. On the way back home, we made a spontaneous trip to the Peak District, and this was even more beautiful than the Lake District. Just phenomenal. We didn’t stay for long at all – just drove through and stopped for some snaps. It was at this moment we decided that our next trip will be to the Peak District (whenever that is, lol).
All in all, it was an amazing weekend. Too many good things and truly very districtful!
Categories
Tech/IT

How to edit an incorrect commit message in Git

Note: VSTS = Visual Studio Team Services.

Sometimes in git, we end up mistakenly writing an incorrect commit message (especially when working on multiple projects). I know there are lots of sources online that walk through how to fix incorrect commit messages, but having personally experienced this, it can take time to search around and put all the relevant steps together. So, I decided to create my own walkthrough guide on how I managed to correct an incorrect git commit message. Hoping this comes in useful for anyone who may stumble across such a situation:

  1. First check out a temp branch –
    • git checkout -b temp
  2. On the temp branch, reset --hard to the commit whose message you want to change. (To find the commit number, log in to either VSTS or GitHub and track commits under the code branch.) Taking the commit number to be 946992, for example –
    • git reset --hard 946992
    • You should now get a message similar to: HEAD is now at 946992 <old commit message>
  3. Use amend to change the message –
    • git commit --amend -m "<new_message>"
  4. This is an optional step. If the commit being edited is the latest commit, this step can be skipped. However, if other commits have been made after it, then cherry-pick all the subsequent commits after 946992 from master onto temp and commit them. Use amend if you want to change their messages as well:
    • git cherry-pick 9143a9
    • git commit --amend -m "<new_message>"
    • git cherry-pick <last commit number>
    • git commit --amend -m "<new_message>"
  5. Now, force push the temp branch to remote –
    • git push --force origin temp:master
    • At this point, if you get the following error: ![remote rejected] temp -> master ( … You need the Git ‘Force Push’ … ), it means that VSTS or GitHub has locked the Force Push permission for you. If this is the case, you need to go into VSTS or GitHub and grant yourself the “Force push (rewrite history, delete branches and tags)” permission. You may have to ask somebody else (a directory owner) to assign the rights to you, then try again.
  6. Once done, make sure you are still on your temp branch –
    • git checkout temp
  7. Delete the master branch locally (make sure it is a capital D) –
    • git branch -D master
  8. Then git fetch origin master
  9. Now, finally, git checkout master

This will move you back onto your master branch locally, and you should be able to continue as normal from here.
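To see the whole flow end to end, here is a sketch you can run safely: it builds a throwaway repo (with a bare repo standing in for VSTS/GitHub) and then walks through the same steps. All names, paths and commit messages here are invented purely for the demo.

```shell
# Self-contained demo of the walkthrough, run against a scratch repo
# so nothing real is touched. A local bare repo stands in for the remote.
set -e
DEMO=$(mktemp -d)
cd "$DEMO"
git init -q --bare origin.git          # stand-in for VSTS/GitHub
git clone -q origin.git work
cd work
git config user.email "demo@example.com"
git config user.name "Demo"
BRANCH=$(git symbolic-ref --short HEAD)  # 'master' in the post; newer git may default to 'main'

# Build some history: a good commit, a bad message, then a later commit.
echo one > file.txt;    git add file.txt; git commit -qm "first commit"
echo two >> file.txt;   git commit -aqm "an incorect message"
echo three >> file.txt; git commit -aqm "follow-up commit"
git push -q origin "$BRANCH"
BAD=$(git rev-parse HEAD~1)            # the commit whose message is wrong
NEXT=$(git rev-parse HEAD)             # the commit made after it

# Steps 1-3: temp branch, hard reset to the bad commit, amend its message.
git checkout -qb temp
git reset -q --hard "$BAD"
git commit -q --amend -m "a corrected message"

# Step 4: cherry-pick the later commit back on top of the amended one.
git cherry-pick "$NEXT" > /dev/null

# Step 5: force push temp over the remote branch (on a real VSTS/GitHub
# remote this is where Force Push permission is needed).
git push -q --force origin "temp:$BRANCH"

# Steps 6-9: rebuild the local branch from the rewritten remote history.
git checkout -q temp
git branch -D "$BRANCH" > /dev/null
git fetch -q origin "$BRANCH"
git checkout -q "$BRANCH"
```

On a real remote only the numbered steps apply; the scratch-repo setup at the top exists purely so the sketch is runnable as-is.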

Categories
Tech/IT

BIG DATA ANALYTICS SERIES – P4 – AZURE DATA FACTORY V2

Microsoft have now released v2 of Data Factory. Though it is still in preview, it has the handy ‘Author and Deploy’ tool, which includes the copy activity wizard to assist in creating a copy data pipeline. Most of this is the same as v1; however, changes have been introduced in this second iteration, and I have had the fortune of working with them – this blog is exactly about that. I will highlight the differences that Azure Data Factory v2 has brought in (as of the time of writing), so I wouldn’t be wrong in saying that further changes and differences are most likely on their way too. I am assuming that anyone reading this blog has prior experience of using Data Factory. The following are the differences:

  1. Partitioning via a pipeline parameter – In v1, you could use the partitioning property and SliceStart variable to achieve partitioning. In v2 however, the way to achieve this behaviour is to do the following actions (This applies both when using the Copy Wizard and an ARM Template for the pipeline):
    1. Define a pipeline parameter of type string.
    2. Set folderPath in the dataset definition to the value of the pipeline parameter.
    3. Pass a hardcoded value for the parameter before running the pipeline, or pass a trigger start time or scheduled time dynamically at runtime.
    4. Here is an example of the above from an Azure Resource Manager Template:
      "typeProperties": {
          "format": {
              "type": "ParquetFormat"
          },
          "folderPath": {
              "value": "@concat('/test/', formatDateTime(adddays(pipeline().TriggerTime,0), 'yyyy'), '/', formatDateTime(adddays(pipeline().TriggerTime,0), 'MM'), '/', formatDateTime(adddays(pipeline().TriggerTime,0), 'dd'))",
              "type": "Expression"
          },
          "partitionedBy": [
              {
                  "name": "Year",
                  "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" }
              },
              {
                  "name": "Month",
                  "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" }
              },
              {
                  "name": "Day",
                  "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" }
              }
          ]
      },
  2. Custom Activity – In v1, to define a custom activity you had to implement the (custom) DotNet Activity by creating a .NET Class Library project with a class that implements the Execute method of the IDotNetActivity interface. In Azure Data Factory v2, you are no longer required to implement a .NET interface for a Custom Activity. You can now directly run commands, scripts and your own custom code, compiled as an executable. To configure this implementation, you specify the command property together with the folderPath property. The Custom Activity will upload the executable and its dependencies to folderPath and execute the command for you. Linked services, datasets and extended properties defined in the JSON payload of a Data Factory v2 Custom Activity can be accessed by your executable as JSON files, and the required properties can be read using a JSON serialiser. To create an executable for a Custom Activity you need to:
    1. Create a New Project in Visual Studio
    2. Windows Desktop Application -> Console Application (.NET Framework). Be sure you target the .NET Framework and not .NET Core otherwise at build time a .exe will NOT be created.
    3. Add in code files as needed including JSON files i.e. Linked Services etc.
    4. Once done, build the project, then find the executable under the project folder: \bin\<Debug or Release>\<MyProject>.exe
    5. Upload the .exe file to Blob Storage in Azure (make sure the executable is referenced in the Azure Storage linked service template). When uploading a custom activity executable to blob storage, be sure to upload ALL contents of the bin\Debug (or Release) folder – just copy the entire folder to blob, otherwise the custom activity will fail, as it will not be able to find the dependencies the application needs to run. Also, use subfolders when uploading custom activities; this makes it future proof in case further activities are added. Best practice is to use Azure Storage Explorer, in which you can access the storage account and create the container and subsequent folders. This can’t be done directly in the Azure portal because blob storage is a flat structure, so the concept of folders is non-existent; however, in Storage Explorer the ‘/’ creates a pseudo hierarchy in the blob, making it a virtual folder.
    6. Create the pipeline in Data Factory v2 using Batch Service -> Custom.
    7. Create a Batch account and pool (if not already created) and set up the pipeline as normal.
    8. Trigger the run and test the pipeline.

Custom Activities run in Azure Batch, so make sure the Batch service meets the application’s needs. Whilst we are on the topic of Azure Batch, I would like to add a note on monitoring it: to monitor custom activity runs in an Azure Batch pool (or an Azure Batch run in general), use the tool Batch Labs. Once a run has executed, you can view its stderr.txt or stdout.txt file for the run details.

Categories
Day To Day

Like, Comment, Subscribe, Tag, Repeat!

In my last blog I mentioned a dedicated blog on social media and its current impact on our generation, because our generation (all teens and adults aged under 30) has apparently been named the ‘Social Media’ generation. I see why we have been given this tag; I don’t have any issues with it (and even if I did, it’s not like anyone would care), but my main motive in this blog is to pen down the issues I see with this entire social media phenomenon. I’m sure you have read content of a similar nature online before, so nothing new (I’m hoping), but as a tech guy this falls right onto my radar. I’m going to try and present this from the perspective of an individual who deals with tech on a daily basis: the role that social media plays within my personal and work life, and its impacts.

I know right now you may be thinking that this is going to be a long lecture or essay but if that is the case then so be it; I’m going to be a lot less formal of course and bear in mind I do enjoy writing, so I may just go on for a while.

I’ve never felt more motivated than now to actually write about this; if it wasn’t for a conversation with an old friend who re-connected with me, I don’t think I would have been. So, where did this whole social media craze start? Well, it was Mr Zuckerberg, the pioneer of the biggest social media network in the world – Facebook – who started all of this back in February 2004. There may have been other social media/networking sites in existence prior to Facebook, but this was the trendsetter. Who knew a university computing project would become a worldwide craze overnight? Dude, this guy is the fifth richest man in the world – jheeze! Sorry, going slightly off topic. Coming back to social media: Facebook was then followed by Jack Dorsey’s Twitter. Of course, Facebook and Twitter can’t be directly compared, because they vary from each other in their own ways, but they still fall under the same social media bracket. Then, as time went on, we saw the likes of Instagram (now owned by Facebook) and, latest of all, Snapchat. Now we also have WhatsApp, the biggest chat messenger in the world, and more recently YouTube (which is turning people into online sensations); neither is strictly a social media platform, but I will include them here anyway because they fall straight into the topic.

Most people (I say most as I still have friends who do not) have accounts on these social media platforms; it now seems to have become a norm to have an account on them otherwise people look and think about you in a different way. Weird right? I have witnessed this myself happening to some of my friends and it does make me sick to the gut – what gives people the right to judge others just because they don’t have a social media account? I mean don’t get me wrong, social media is great to keep up to date with the latest happenings around the world and follow your favourite celebrities and so forth but along with it comes growing problems, which I will discuss below.

I’ll try and not make this too generic to focus on the impact it is having on me.

As I mentioned, I’m a tech nerd so it goes without saying that I am of course on all the top social media sites, and I use them on a daily basis, some passively, others extensively. I am opening up a little about my experience; I’ve not discussed this before with anyone but believe self-reflection on your habits will only help us be a better version of ourselves and help us see our short falls from a wider perspective. It’s like looking at your own routine as an outsider; I am doing exactly this in this blog.

The first thing I do when I wake up and the last thing I do before I sleep is check my phone – you got it: Facebook, Twitter, etc. I am sure I am not the only one with this habit; I can guarantee a lot of us have this routine too, to the point that I also drag my phone to the toilet and find myself scrolling through my feeds. It has recently made me realise how scary this is, because I do it all subconsciously, without thinking. My hands now naturally grab my phone without me having to think about the action – I bet pretty much all of us can manage if we forget our wallet, but not our phone, right? I am also guilty of this! We take our phones with us because all we really care about is taking that perfect selfie on a night out and then posting it on Insta to get lots of likes. Am I wrong? And then going back repeatedly to check how many likes and comments our post got.

By the way, I don’t have a problem with mobile phones; they are of course great in emergency situations. But I also remember the days of the Nokia 3310 and the joy I got out of playing Snake on it. That was epic! Long gone are those days though. I’m an Apple fanatic myself, and if it wasn’t for the iPhone I don’t think society would be like it is today. Yes, smartphones are amazing, I get all that shazzle (I code apps for them myself), but it’s scary. Like, really scary – all these addiction habits, especially as I act them out subconsciously. The even scarier thing is that I have now seen primary school children with smartphones on Facebook and the like – and there was me getting excited over MSN Messenger (those were the days). It’s clearly evident that this is a social media addiction that is only spreading!
Young children make themselves extremely vulnerable by getting onto social media – the increasing number of paedophilia cases is related to this. Personally, I don’t think parents should give smartphones to their children, but that’s a different subject for a different day.

As I said, I use these platforms daily, both as part of my job and personally; I post regularly about interesting tech projects I have worked on, and Twitter in particular is great for keeping up to date with the tech world, upcoming hacks, tech conferences and so on. Personally, I use these platforms for posting my tech blogs and promoting my graphic design work, so it does have its advantages. Putting all this aside, the point I am trying to make is that I know what I am doing, whereas I have found that a lot of people don’t. They have simply joined these platforms for the sake of jumping onto the social media bandwagon and not wanting to feel left out. People who don’t know how to properly interact with these platforms only do more damage to themselves. Some people post their entire life on there, some to the point that they make accounts for their new-borns almost instantly. I mean, come on, give me a break. It is actually sad; social media is dismantling society to the point that I even forgot how to write with a pen. Yes, that is correct. I had a phase where I had not written with a pen for 6 months, and when I finally did, I struggled. Since that day I make notes at work regularly in my notebook. Yes, I have a laptop on which I can make notes, but after the experience of not being able to write properly, I don’t find that the best of ideas. We are all so habituated to checking our social media accounts that on a day we don’t, it feels weird.

Let me tell you, I got to a point where I genuinely felt upset if I didn't get a WhatsApp message or a Facebook like. I mean, come on – this is what opened my eyes, and I realised that this could well be something others are going through too. I started seeing how social media is dismantling society – people are forgetting simple everyday skills like communicating with each other. This becomes a lot more apparent on my morning commute, when every person on the train is absorbed in their own world, headphones in and head down in their phone screen. I admit I have also done this, but recently I have tried to hold verbal conversations with people instead, which sometimes goes down well with others and sometimes doesn't. It just depends on the person.

I know I sound really negative about social media right now; that is because, for me, the cons outweigh the pros. Of course there is good and bad in everything, and the same goes here. As I said, it's great for keeping in touch with people on the other side of the globe, promoting a business and so on, but here I am focusing on the more serious issues.

Social pressure is a big one, and society only makes it worse; as I mentioned earlier, it compels you to join so that you don't feel left out and lonely.

My next point is a big one. As part of my day-to-day job I am behind a screen as it is, but I got so absorbed in all this that I soon realised I was not giving enough time to those actually around me – my family and friends. I was so busy behind the screen that I became ignorant of what was in front of me. I had created a virtual world for myself that consisted of likes, comments and blue ticks; until recently it also included a subscribe button (yes, I am fond of my YouTube channels). This life behind the screen has led to a rising number of online YouTube sensations – it's great, but we end up subscribing to a million channels and spending all day on them.

I am so glad I realised what I was doing sooner rather than later. I took myself out of my comfort zone and trialled an experiment for a month: I kept my phone away from me during work hours and when I went to sleep at night. This automatically stopped me checking it first thing in the morning and it distracted me less at work. (I used it for posting work-related things, but no WhatsApp or personal use.) Straight away I was interacting more with my surroundings and, without having to make an effort, naturally spending time with friends and family. I was on social media a lot less, felt a lot happier and, of course, ACTUALLY felt real emotions, not virtual ones. It was then that I realised that the virtual reality and virtual happiness I had created for myself was only isolating me.

I feel that you can't even openly express your feelings on social media without them being turned into controversy; we have seen it happen before and it will continue to happen. I felt my opinions were being suppressed, so I decided to do something about it before it got any worse. I still encourage using social media, but don't make your life part of social media – make social media part of your life. Don't let it control you; you control it. I finally realised that ultimately it is those around me who will stand by me, not the number of likes I have got on a post. The way things are progressing, it won't be long before 'Social Media Syndrome' becomes an official medical condition. I hope that is not the case; I am glad I got my wake-up call and have controlled my social media addiction. Trust me, it will only cause you more damage if you excessively Like, Comment, Subscribe, Tag, Repeat!

 

Categories
Day To Day

Double Trouble

So, I’m back with another blog.
I've been quiet once again – last month was birthday month! Not only my birthday; a lot of my friends were also born in February, so you can imagine how busy I was. It's one of the few times of the year when my friends and family get my undivided attention and I am away from tech. Nevertheless, I was meaning to blog sooner than this, but time got the better of me.
Speaking of birthdays, this blog is about exactly that. After over two years of promising each other we would meet, I finally got the chance to meet up with some uni friends. This was in the making for what felt like forever; we get so busy in our work lives that we forget to give time to friends and family. I know we have WhatsApp, Facebook, Instagram, Snapchat and other platforms, but meeting in person is a whole different game. I'm going to discuss social media in a different blog, so I don't want to go off topic and start talking about its current importance in our lives.
Coming back to the topic at hand: we finally managed to find a day, time and place where the four of us could meet and have a long catch-up. It also turned out to be the weekend before my birthday, so what better reason than that – I got on the phone and booked a table at one of my favourite restaurants, 'Feast India'. The last time I went there was back at university (over six years ago). Yes, seriously. This now just makes me sound like a boring, old, unsociable git, but anyway.
It worked out perfectly, as two of us travelled from Loughborough and the other two came from Leicester. It felt sort of weird going back to Feast India, as so many uni memories came flooding back. The place is an eat-all-you-want buffet catering for all the regional foods of India – my favourites being Punjabi, Maharashtrian and, of course, Gujarati. I feasted on delicious Paneer Tikka, Pani Puri, Sev Chaat, Pizza, Garlic Naan, Aloo Tikki, Masala Dosa and Dahi Vada. Sounds like a lot, right? It was, but I only had small portions of each. Even so, I felt like I was going to pop, as I had not eaten such rich, oily food since starting my fitness journey back in September. It did feel like heaven though; I thought, why not give myself the evening off and spoil myself, seeing as my birthday was coming up (and it's not like I'm one to celebrate, so I may as well eat all I can).
Just as I went for my second round of food, I noticed someone on the opposite side who resembled an old uni friend, but it was difficult to tell at first because he had his back turned. I took the chance and approached him, and indeed it was who I thought. What a pleasant surprise! He wasn't alone either: his little daughter was with him, and so was his wife (who also went to Loughborough University and was in the same batch as me), which was a double bonus. The wait for this reunion was well worth it. It felt great to see an additional bunch of friends and their little munchkin (with another one on the way); we had so much to catch up on, but I think we were so dazed by the coincidental surprise that we were lost for words. As they say, a picture speaks a thousand words, so a selfie was a must.
This reunion was surely well worth the wait; it indeed ended up being Double Trouble!
Categories
Tech/IT

Big Data Analytics Series – P3 – Service Account Authentication with Azure Hosting

A recent project required me to use the Google Analytics Core Reporting API for data ingestion. The API call was made inside an Azure Function, which worked completely fine locally but failed during the service account authentication step when hosted in Azure, with an 'Invalid provider type specified' error for the authentication certificate.

Three days of debugging and research finally led to the answer. The issue was not actually with the client library; it came down to the way the X509KeyStorageFlags for the authentication certificate were set.

X509KeyStorageFlags define where and how to import the private key of an X.509 certificate:

  • MachineKeySet – Private keys are stored in the local computer store rather than the current user store.
  • PersistKeySet – The key associated with a PFX file is persisted when importing a certificate.
  • Exportable – Imported keys are marked as exportable.

This is a known issue with Azure hosting; you need to tell the server how you would like it to deal with the X.509 certificate.

As per the API documentation, to load the private key from the certificate, the following code is needed:

var certificate = new X509Certificate2(@"<certificatePath>", "<privatekey>", X509KeyStorageFlags.Exportable);

This line of code will work fine locally, but it will fail in Azure because we need to tell the Initializer that the private key(s) are stored in the local computer store rather than the current user store. The fix is simply to add an extra flag to the final parameter, as shown below:

var certificate = new X509Certificate2(@"<certificatePath>", "<privatekey>", X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.Exportable);

If you check the definition of MachineKeySet above, this does exactly what we need, telling the Initializer that the private key or keys are stored in the local computer store rather than the current user store.
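For context, here is a hedged sketch of how the pieces fit together end to end – loading the certificate with the corrected flags and then building the service account credential. The class names come from the Google .NET API client library as I understand it; `<serviceAccountEmail>` and `<appName>` are placeholders I have added for illustration, and the exact working code is in the gist linked in this post.

```csharp
using System.Security.Cryptography.X509Certificates;
using Google.Apis.Analytics.v3;
using Google.Apis.Auth.OAuth2;
using Google.Apis.Services;

// Load the certificate with MachineKeySet so the private key is read
// from the local computer store (required when hosted in Azure).
var certificate = new X509Certificate2(
    @"<certificatePath>", "<privatekey>",
    X509KeyStorageFlags.MachineKeySet | X509KeyStorageFlags.Exportable);

// Build the service account credential from the certificate.
var credential = new ServiceAccountCredential(
    new ServiceAccountCredential.Initializer("<serviceAccountEmail>")
    {
        Scopes = new[] { AnalyticsService.Scope.AnalyticsReadonly }
    }.FromCertificate(certificate));

// Use the credential when constructing the Analytics service client.
var service = new AnalyticsService(new BaseClientService.Initializer
{
    HttpClientInitializer = credential,
    ApplicationName = "<appName>"
});
```

The only Azure-specific part is the `MachineKeySet` flag on the first call; everything else is the standard service account flow from the client library documentation.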

So, the final Service Account Credential code is included in the GitHub gist linked below:

https://gist.github.com/dbchudasama/434ff9138ca12e657ad1bf5254aafc0c

Hopefully this saves anyone using Service Account Authentication with Azure hosting (not just with the Google Analytics API, but any such API) hours of debugging and time.

NOTE: Replace values inside <> for your own values.

Categories
Tech/IT

BIG DATA ANALYTICS SERIES – P2 – Setting up a Mock (Local SQL Server) Data source for Data Management Gateway (DATA FACTORY)

This blog walks you through setting up a local instance of SQL Server on your machine with the aim of creating a mock-up data source. It has been written from a technical perspective, so it is assumed that you are tech friendly. This procedure is a sub-part of data integration between on-premises data stores and cloud data stores using Data Factory, and falls under the process of 'Moving Data between on-premises sources and the cloud with Data Management Gateway'. A link to full instructions for the latter part of the process is shared below as well, but the prime focus of this blog is the pre-requisite aspect (as this can turn out to be a nuisance if configured incorrectly):

  1. Download SQL Express (Developer’s Edition)
  2. Install SQL Express. Select the 'Custom' installation option and create a user during setup. Make sure to install the 'Database Engine' feature along with any others required.
  3. Now, carry out the following checks:
    • Go into SQL Server Configuration Manager
    • Select SQL Server Network Configuration
    • Open 'Protocols for MSSQLSERVER'
    • TCP/IP needs to be enabled. Right click on TCP/IP and select enable.
    • On the 'IP Addresses' tab, scroll down to the IPAll section and set the TCP port to 1433.
    • Restart the Server. To do this:
      • Go to SQL Server Services
      • SQL Server
      • Right Click ‘Restart’
        • If you have issues restarting, then just restart your machine. The server should automatically start once the machine has been restarted but nevertheless double check in configuration manager.
        • A few spot checks to see if the server is running:
          • ‘ping localhost’ in cmd line
          • Enable telnet to be able to connect to the port:
            • Run command prompt in Admin mode
            • Type the following in command prompt to enable telnet: ‘dism /online /Enable-Feature /FeatureName:TelnetClient’
            • Now open a new command prompt
            • Type ‘Telnet’ and press Enter. This will show the telnet welcome message.
          • Once telnet has been enabled you should be able to connect to local host via telnet giving it the TCP Port.
            • Open a command prompt
            • Type in ‘telnet <IP Address> <Port>’ and press enter. Port here should be 1433.
            • If a blank screen appears then the port is open, and the test is successful.
            • If you receive a 'connecting …' message or an error message, then something is blocking the port – most likely a firewall, either Windows or third party.
  4. Connect from the command line with the following (the -E flag connects using Windows-based authentication, i.e. a trusted connection):
  5. 'C:\> sqlcmd -S <ip-add> -E'. NOTE: You can find the local IP address by typing 'ipconfig' in command prompt.
  6. Run the following, substituting in your own credentials:
    • Role should be a minimum of 'db_datareader'. SQL Server allows a user to be allocated one of three roles: 'db_datareader' (read permission only), 'db_datawriter' (read and write permissions) and 'db_owner' (all permissions).
    • <login_name> and <user_name> should be the same.
      • CREATE DATABASE <db_name>
      • GO
      • CREATE LOGIN <login_name> WITH PASSWORD = N'<password>'
      • GO
      • USE <db_name>
      • GO
      • CREATE USER <user_name> FOR LOGIN <login_name>
      • GO
      • EXEC sp_addrolemember 'db_owner', '<user_name>'
      • GO
      • exit
  7. Now, connect with the test account created by running the following in command prompt:
    • sqlcmd -S <ip-add> -d <db_name> -U <login_name> -P <password>
  8. You should now be at SQL Prompt having successfully connected.
  9. You should now connect to the server via SSMS (SQL Server Management Studio) using the above created credentials:
    • Open SSMS
    • Select ‘Database Engine’
    • For the server name, use the computer name if your server is called 'MSSQLSERVER' – this means you are using the default (unnamed) instance. If your instance is called 'SQLEXPRESS', you are using a named instance and will have to use the syntax 'localhost\SQLEXPRESS'.
    • Username: username created above
    • Password: password created above
    • Either authentication method should work.
    • Now, commence in SSMS as normal
    • You can continue and set up the Mock Database Gateway and ADF pipeline in Azure. Follow the link below for a full walkthrough. You would also follow the same link for a real data source gateway set-up, just configuring the input dataset to be the actual data source:

https://docs.microsoft.com/en-us/azure/data-factory/v1/data-factory-move-data-between-onprem-and-cloud

The same set up can be completed using a VM (Virtual Machine). Follow the same steps but firstly deploy a Virtual Machine in Azure, then connect to it via Remote Desktop and continue as normal.

 

Categories
Tech/IT

All Things Cloud at Microsoft

Finally, I have managed to get around to this blog. It's long overdue because of Christmas and New Year, but as they say, better late than never!

If you are a regular reader of my blogs then you may recall me mentioning this a couple of times in my previous blogs, but thanks to Elastacloud, I had the opportunity to attend a special Azure Technical Briefing at the Microsoft Paddington Central Office in London. This is an exclusive inside event for Azure/Cloud developers (like myself) where Microsoft reveal all the latest developments in Azure and the world of Cloud Computing, but from a very technical perspective.

It is a very high-demand event for which only a handful of candidates are selected. I was lucky: not only does my boss have close ties with Microsoft, but the man organising the event is a good friend of his. Thanks to this I skipped the reservation queue and got direct entry – wicked! In fact, it was my boss himself who told me about the technical briefing, encouraging me to sign up and leave the rest to him, which was super beneficial for me, and that's what counts.

I should have just taken the train, but me being me, I decided to travel by car. Yes, travelling by car to London, on a weekday, in the morning – not the cleverest of ideas. I realised this when it took me 4 hours to get to the Microsoft office from my house (Loughborough to Paddington is 112 miles – nominally 2 hours 19 minutes), so you can see how bad the traffic must have been. I did have a bit of a panic thinking I was late (registration started at 9:30am, with the day beginning at 10am) as I got there at 10:10am. However, over half of the attendees hadn't yet shown up, which was a great relief. Before you say anything, I'm a pretty punctual guy most of the time – I usually arrive ten minutes before the start time – but sometimes I am also a victim of "Indian Timing", which I try to avoid if I can.

I was like a kid at a comic con (OK, maybe not exactly like that, but it's the closest description I can find for how I felt). I was buzzing – I don't think I used Snapchat as much all year as I did on that day, lol. I was snapping my time at the office as much as I could; the classy glass building and super cool elevators were quite something. This was the sort of place I had always wanted to visit, and now that wish had become a reality. To add to the ecstatic feeling, I was dressed in smart clothing (which, surprisingly, I enjoy), which further boosted my enthusiasm and excitement. I was at MICROSOFT!

At the classy reception desk I checked in, took my badge and was directed to floor 5, where the briefing was taking place. I thought only developers would be attending, but I was joined by project managers, business analysts and more. Not everyone was a coder or from a programming background, which did take me by surprise, but the briefing was organised into multiple sections, each targeting a different role within the business architecture. Some sections were delivered from a project manager's perspective, others from a business analyst's point of view, while others were very developer-focused, heavy in code and technical language. It's obvious where I fit in, so I don't think I need to say much on that.

Without going into the technical aspects (as I am sure you don't want or need all the detail), here is an overview of what was covered on the day:

The briefing was split twofold – Continuous Integration/Continuous Deployment (CI/CD) and Visual Studio Team Services (VSTS).

CI/CD covered the following:

  • Automated Testing
  • Release Management
  • Usage Monitoring
  • Code Reviews
  • Continuous Measurement
  • Feature flags
  • Infrastructure as Code/Infrastructure As A Service
  • Configuration Management

 

VSTS focused on:

  • Agile Project Management for Visual Studio Team Services
  • Using Pull Requests with VSTS
  • Moving from Subversion to Git
  • Creating CI/CD Pipeline with VSTS into Azure
  • .NET Development in Azure with VSTS
  • Build in Azure and deploy on-premises with VSTS
  • Container based deployments with Docker, Kubernetes, Azure and VSTS

I was fortunate to have already covered some of these topics in previous projects at work, so it was good to have prior knowledge and hands-on experience to relate the information to. It was an insightful day and I found it really helpful. My favourite topics were Release Management and Infrastructure as a Service (IaaS) – these were really cool!

The icing on the cake was getting to speak to and take a selfie with the legendary Edward Thomson, Git Project Manager for Microsoft Visual Studio Team Services – the man who wrote the code that merges pull requests for developers. This was truly EPIC (as seen below)!

I also had the opportunity to speak to David Gristwood (Technical Evangelist at Microsoft, the man who had organised this briefing and my boss’ friend) over lunch, which was also a pleasant experience.

The day closed at 4:30pm (it overran; it was supposed to finish at 4pm). I made a few good friends at the event through networking, and I hope to meet them again at similar events in the future (we have exchanged numbers and regularly discuss upcoming events and attendance). Last but not least, what is signing out of Microsoft without striking a pose? That's exactly what I did before leaving – Microsoft, thank you for having me, it was a pleasure!