Creative Commons Attribution 4.0 International License
© Rakhesh Sasidharan


Batman vs Superman

I saw “Batman vs Superman: Dawn of Justice” for the second time today. It was on Netflix, and while I didn’t enjoy it much when I saw it the first time (when it was released in theaters) I thought I’d give it a go anyways. Good decision coz I absolutely loved it!

The first time I saw it I found the movie pointless. Why were Batman and Superman fighting? Why was Batman so angry about things? Why was Lex bent on creating misunderstandings between them? Why was everything so intentionally dark and gloomy? So many things I didn’t like!

This second time, however, I saw the movie in a different light. There’s a drama to it, a certain “theater”… like in a play, or even like Zack Snyder’s own “300”. When I saw this movie the first time I still had the Christopher Nolan Batman in my head and so I wanted a grounded movie. I didn’t want “cinema”, I just wanted a character-driven Batman and Superman movie. But that’s not what “Batman vs Superman” is about, and I am surprised I missed the whole point in the first viewing!

Or maybe I have changed since that viewing. I know, for instance, that many audiobooks I enjoyed (or not) the first time around sometimes bring out the opposite reaction in me on a second hearing. Maybe this one’s like that. Maybe this time I was more open and attuned to the iconography in this movie.

“Batman vs Superman” is in the difficult position of being an in-between movie. We have no backstory for Bruce Wayne short of the intro sequence, and all we know is that he has been doing this for a long time, that he has already faced the Joker, that Robin is probably dead… this is not the grounded Batman of the Chris Nolan trilogy but an older, pessimistic, and angry Batman. Into this comes Superman and the whole context of him being a God. The theme here is not about Superman being an alien (as in “Man of Steel”); rather, it’s about him being a God, a savior for mankind. And that’s where the whole question comes in of whether he is really a God, or a Devil hiding behind the mask of a God, or even a False God (i.e. one that can bleed, a reference to the Persian King and the scene from “300”). I missed all of this the first time. The references to the False God, the painting in Lex’s office, a lot of Greek references, the amazing scenes such as the one in Mexico on the Day of the Dead or even when Superman is dead and everyone’s holding his body… this movie is all about the scenes, the “cinema” itself, more than just characters or a story… it’s Greek drama on the big screen with larger-than-life characters. God vs Man after all!

Back to what I was saying: this is an in-between movie. It’s a part of the overall arc that would have been but now won’t happen (because it’s canceled). There’s Darkseid, there’s the evil Superman, there’s all that stuff which would have come out if the studio had just stuck with it… and when we watch “Batman vs Superman” in the entirety of that storyline it would make a lot more sense too. That’s not going to happen, unfortunately, and even the “Justice League” movie has a different tone from what I remember… our loss! Cheers to Zack Snyder though for creating this one. It’s worth every scene!

Useful NPS & certificate stuff (for myself)

Came across an odd problem at work the other day involving NPS and wireless APs. We have an internal wireless network that is set to authenticate against Microsoft NPS using certificates. The setup is quite similar to what is detailed here, with the addition of using certificates issued by an internal CA for NPS to authenticate the users (as detailed here or here, for instance). 

All wireless clients stopped being able to connect to the wireless network. That’s when I realized the logs generated by NPS (at C:\Windows\System32\Logfiles) are horrendous. One option is to change the log format to “IAS (Legacy)” and “Daily” and use a script such as the one here to analyze them. Alongside that, it is also worth changing the format to “DTS Compliant” as that produces more readable XML output. All of this stuff is in the “Accounting” section BTW: 

[screenshot]

Pro Tip: If you go with the XML format and use Visual Studio Code, you can prettify the XML as mentioned here.

From the logs we could see entries like this:

    <Authentication-Type data_type="0">5</Authentication-Type>
    <Packet-Type data_type="0">3</Packet-Type>
    <Reason-Code data_type="0">259</Reason-Code>

In this case the packet type of 3 means the access was rejected, and reason code 259 means the CRL check failed. (Nope, I don’t know these codes off the top of my head! My colleague who did the troubleshooting came across this. The PowerShell script I mentioned above converts some of the codes to readable values, but it too missed error 259.) If you want to read more about the flow of traffic and why rejection might happen, this article is a good read. 
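Rather than eyeballing the XML, the interesting fields can be pulled out with sed. A quick sketch using the sample entries above (on a real server you’d feed in the log file instead of a shell variable):

```shell
# The sample DTS Compliant log fragment from above
log='<Authentication-Type data_type="0">5</Authentication-Type>
<Packet-Type data_type="0">3</Packet-Type>
<Reason-Code data_type="0">259</Reason-Code>'

# Extract the packet type (3 = Access-Reject) and the reason code
packet=$(printf '%s\n' "$log" | sed -n 's/.*<Packet-Type[^>]*>\([0-9]*\)<.*/\1/p')
reason=$(printf '%s\n' "$log" | sed -n 's/.*<Reason-Code[^>]*>\([0-9]*\)<.*/\1/p')
echo "Packet-Type=$packet Reason-Code=$reason"
```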

We didn’t really get to the bottom of this issue, i.e. why NPS couldn’t retrieve CRLs (it looks to be one of those random issues), but I spent some time reading up on certificates and NPS etc. and picked up a bit of CRL stuff while troubleshooting, so I want to put that info here. Mainly, certutil. This tool can be used to check CRLs etc. 

The command certutil /crl (from an admin command prompt on the CA) causes it to publish the CRL. In my case publishing was via LDAP, and the command returned no errors. You can find the CRL URL from any certificate. In my case it was a long LDAP URL that looked something like this: ldap:///CN=blahblah,xxxxl?certificateRevocationList?base?objectClass=cRLDistributionPoint. You can use certutil /url with the URL to query it. You can also use ADSI Edit to view the configuration partition and go to the URL to see the last modified timestamp etc. 

The certutil command has many more useful switches (like in this blog post and this wiki entry; the latter has many more examples). For example, you can export a certificate to a file and then run a command such as certutil /verify /urlfetch \path\to\certificate.cer. This will verify the certificate up the chain, and also check the CRL specified in the certificate. 

It is also possible to export a CRL from the CA, view the exported CRL, and import it on a different server:

    certutil /getcrl \path\to\file.crl
    certutil /dump \path\to\file.crl
    certutil /addstore CA \path\to\file.crl

In our case we ended up exporting the CRL from the CA and importing it to the NPS server to quickly work around the issue. 

Later I learnt that there’s a reg key which can be used to disable CRL checking by NPS. Not that you want to do that permanently, but it’s useful as a quick fix. Another thing I learnt is that there’s a reg key that controls how long the NPS server caches the TLS handle of authenticated computers. By default it is 10 hours, but it can be extended. 
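For my own future reference, here’s a .reg sketch of those two settings. Treat it as a pointer rather than gospel (I haven’t verified these on every OS version); these are the documented EAP-TLS registry values, which live under the EAP\13 (EAP-TLS) key on the NPS server:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\RasMan\PPP\EAP\13]
; Skip certificate revocation checking entirely (quick fix only, don't leave this on!)
"NoRevocationCheck"=dword:00000001
; How long the TLS handles of authenticated clients are cached, in minutes
; (0x258 = 600 minutes = the 10 hour default; raise it to extend)
"ServerCacheTime"=dword:00000258
```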

Google search for Apple Music is better than Apple Music search!

It’s annoying how good Google search is. Many a time I search for a song in Apple Music, don’t find it, and think it’s not there. But then I do a Google search for “<song name> iTunes” and bam! it returns an iTunes link I can click to open the song in Apple Music. :) Neither Bing nor DuckDuckGo does this! It’s irritating because Apple Music should be doing this in the first place (it’s funny, right, that Google indexes Apple Music better than Apple itself) and it’s one more reminder of how good Google is for searching, even with all its privacy concerns etc. 

Chekka Chivantha Vaanam

Saw this one today. 

  • Great songs by A.R. Rahman.
    • But they’re not integrated well into the movie. They distract from the movie rather than add to it. Most of them seem like they are placed just because the songs have to go somewhere. 
    • Didn’t like the background score much either.
  • The story seemed kind of directionless. It was marketed as a violent thriller about 3 sons fighting for their father’s empire. That fight doesn’t start until after the intermission, and even then we don’t really care for it.
  • The women seem to be there just for skin. Except for Jyothika, who has somewhat of a role, the rest are wasted. Which sucks coz they seemed interesting, and ignoring them for the three men didn’t do justice to them. 
  • Good to see Aravind Swamy after a long time! 

That’s it really. It was an OK 2.5 hours. I could have spent it watching something else more worth my time I guess … but ah well, Mani Rathnam movie, I wanted to watch it … and even though I was bored I kept on in the hope that things might turn out to be interesting. (Hint: they didn’t!) 

Note to self: PowerShell can do JSON and Invoke-WebRequest for REST API calls

I am so bummed at myself! Proud, but also bummed. 

At work we are doing some migration work, and the vendor we are migrating data to has a REST API which we can talk to using curl, passing data as JSON. I spent the last two weeks creating various bash scripts that can send and receive JSON. I did a good job (in my opinion), learnt a lot of things (discovered jq for instance, it’s amazing!), and it was great working with bash and sed and awk and all these *nix tools after such a long time (and all this was done on macOS this time, so it was a good way to spend time on the macOS CLI too) … but I now realize that, doh, PowerShell supports JSON and I could have used Invoke-WebRequest for all my curl calls. I could have done the whole thing in PowerShell, a much more familiar environment! In the process I could have saved some time and taken on a lot less stress.
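To illustrate what I mean (a toy example; the field names and values are made up, not from the actual vendor API): the executable half is the kind of extraction I was hand-rolling in bash, and the commented lines show what the PowerShell equivalent would look like with Invoke-WebRequest and ConvertFrom-Json.

```shell
# Toy version of the JSON plumbing I was doing in bash (sed-based here;
# jq does this far more robustly with: jq -r '.status')
response='{"status": "migrated", "count": 42}'
status=$(echo "$response" | sed -n 's/.*"status": *"\([^"]*\)".*/\1/p')
echo "$status"

# In PowerShell the same thing is built in, no parsing gymnastics:
#   $r = Invoke-WebRequest -Uri $url -Method Post -Body $json -ContentType 'application/json'
#   ($r.Content | ConvertFrom-Json).status
```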

That’s why I am bummed. I am proud I did a good job, but I also kind of wish I had been more aware of what PowerShell can do and taken the effort to Google a bit about it. 

Thing is, I have a huge soft corner for bash and all these things, so I know it’s my internal bias that pushed me to jump at the opportunity and work with these rather than check out PowerShell. I do love me some bash and sed and all those. :)

Nightflyers

Been kind of binge-watching “Nightflyers” on Netflix. The show doesn’t make much sense to me, and all the characters are kind of weird / dumb, and yet I am intrigued and keep watching. There’s probably a better way to spend 10 hours of my life than watching this, but I dunno … part of me wants to see where this goes. I guess it’s because the show began on a high note, with one of the characters killing others (killing everyone maybe?) and so I want to know how that came to be. But I just can’t make sense of the actions of the characters. There’s just a lot of things: an alien race, some Teke energy, some humans called L’s, computers, virtual reality, a girl who can plug into computers … it’s like someone decided to blend all these together and see what comes out of it. There’s no story or direction as such. It’s just going somewhere, and the only things keeping me interested are why the killings in the beginning happened, and that maybe all this irrational behavior is due to the alien Volcryn influence. If I had to pick a crew for an alien space expedition, this would definitely not be the bunch I go with. 

I am still very surprised I didn’t just dump it. Goes to show how a suspenseful beginning can keep you hooked. Maybe that was the idea of the writers too. :) 

I ditched “Titans” after about 4-5 episodes. It was similarly pointless and I stopped caring for the characters. 

Speaking of stuff worth spending time on though, I loved this podcast interview with M. Night Shyamalan. I love M. Night Shyamalan movies, and I especially loved “Unbreakable”. To me, “Unbreakable” is a story idea that I had (seriously) but much much much better executed by M. Night. For me it was just a cool idea in my head of how the world might be, but seeing it on screen was just magical. I didn’t know the movie didn’t fare that well until recently though. For me “Unbreakable” and “Signs” are two of M. Night Shyamalan’s best movies (and top in my list of favorite movies). Both are kind of similar on one level – faith, reason, why – but very different too. I haven’t seen his “The Visit”, so got to watch that now. I dunno how I missed it (well, I know how: I lost interest in his movies after “The Last Airbender” and that TV show he made, “Wayward Pines”). 

Another good podcast episode I listened to recently is this interview with Christina Warren. I had previously heard Christina on TWiT but this was my first time hearing her being interviewed, and it was a fun episode. I came across some interesting Mac app suggestions from her too. 

Apple Music sounds better than Spotify

I use both Apple Music and Spotify. And I pay for both too (esp. Spotify, as I prefer the higher quality music). I always felt, however, that the same song sounds better on Apple Music than on Spotify, though until today I didn’t read more into this. Turns out that, yeah, Apple Music uses 256kbps AAC while Spotify is 320kbps Ogg Vorbis (don’t be fooled by the numbers; AAC is a better format than Ogg Vorbis, so the 256kbps actually translates to something higher if we compare like for like). Am glad to hear that in a way coz Apple Music is my primary music player, but I am also bummed to realize that I may not be getting the best possible quality with Spotify. 

I love Spotify for being able to discover new music. I like its UI, and I find myself turning to it when I am in the mood to discover new stuff. With Apple Music I have a bunch of playlists etc., but often I am just in the mood for someone to make a decision for me. With Spotify I can go to the Discover section and it usually points me to something good. I have discovered so much new music through that. They have great playlists, and most of the time I enjoy whatever it points me to. 

Should I be cheap and use Spotify purely for discovery and actually play the discovered songs in Apple Music? I guess not. That’s not a very smooth workflow. Also, Spotify isn’t bad if I am listening on speakers. It’s only when I have my good headphones on that I notice the difference. I should just remember to use Apple Music if I am on headphones. (Also, the type of music matters. If I am listening to movie scores or something classical, then the quality matters. General pop or fusion etc. isn’t that fussy about quality.) 

Ideally I should be signing up for a lossless streaming service like Tidal, but that isn’t available in Dubai yet. Sucks!

macOS: find app using Secure Input

Ran into an irritating problem today with my task switcher Contexts. It stopped working and there was an orange exclamation mark in the menu bar saying some application has Secure Input turned on and until I close it Contexts can’t work. Initially it told me that Firefox was responsible, and I was about to close it when I realized that whenever I switch to a different app it blames that app as having Secure Input turned on. 

So clearly the issue was elsewhere. This page gives a list of apps that usually have Secure Input turned on. Thanks to this forum post I learnt that you can find the offending app by running the following command:

ioreg -l -w 0 | grep SecureInput

Find the process ID from the kCGSSessionSecureInputPID field. Then use Activity Monitor (easier, as you can sort by PID) or the following command:

ps auxww | grep <pid>
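To save some squinting, the PID can be extracted with sed too. The sample line below is made up but representative of what the ioreg output looks like:

```shell
# Made-up (but representative) line of `ioreg -l -w 0` output:
sample='"IOConsoleUsers" = ({"kCGSSessionSecureInputPID"=223,"kCGSSessionUserIDKey"=501})'

# Extract the value of kCGSSessionSecureInputPID
pid=$(echo "$sample" | sed -n 's/.*kCGSSessionSecureInputPID"=\([0-9]*\).*/\1/p')
echo "$pid"

# On a real Mac you'd then look up the process name with:
#   ps -p "$pid" -o comm=
```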

In my case the culprit turned out to be loginwindow. I tried to kill it but the system warned me that it would log me out. I was in no mood to get logged out, so on a whim I tried locking and unlocking the system. That worked! :)

A simpler setup; no more external keyboards

I spent a lot of money on keyboards these past 2 months. I started with the Kinesis FreeStyle 2 Blue for Mac and while I loved it a lot I didn’t like the delays because of Bluetooth. I should have just purchased the wired version instead, but I decided to go for some other brand (just to try things out) and went with the even more expensive Das Keyboard 4 Professional for Mac. I had high hopes for it, being a mechanical keyboard with NKRO etc., but today I decided to return it. It is a good keyboard, mind you, but I dunno, I wasn’t that blown away by it. I mean, not blown away enough to justify the super high cost; and it wasn’t entirely pleasurable to type on either. As with other keyboards I’d find the space and X keys sometimes stick – I think it’s just me, I am jinxed with these keys. I hate delays. I tend to type very fast, and the smallest of delays irritates me. And I don’t like stuck keys (no one does, I guess). Plus, coming from the split design of the Kinesis FreeStyle 2, the Das Keyboard felt very cramped. 

What I loved about the Das Keyboard was its integrated USB 3 hub and also the giant volume button and media keys. These were a pleasure and these are what I am going to genuinely miss. 

Why am I going to miss these? Because I decided to simplify things. The whole reason for going with an external keyboard was better ergonomics, but somehow it wasn’t working out. I don’t mind the MacBook Pro keyboard, and in fact I had gotten used to it the first few months; and while these external keyboards are way better to use than the MacBook Pro keyboard due to the extra travel etc., I realized I’d better get used to the MacBook Pro keyboard. For one, I will be traveling, and at that point I can’t lug around additional keyboards; and for another, my desk was getting a cluttered look with the extra keyboard and its cable etc. If I go wireless I notice a lag. If I go wired I still notice the occasional lag, plus the keyboard wires mess up my desk. Plus, since I have a keyboard and 2 screens, I have to keep the MacBook Pro at a distance and not really use it as a 3rd screen (except for, say, having iTunes open on it) and I wasn’t too happy with that either. 

So enough with all this mess. Keep things simple. MacBook Pro + its inbuilt keyboard (which needs some getting used to, granted). Plus the two screens that I already had. Plus an extra trackpad (which I already had, coz I prefer the extra trackpad so I can use my left hand too for tracking). Plus the Logitech MX Master 2S mouse (which too I had purchased as part of my experimenting around, and which is useful for my right hand when I need a mouse as such). It’s a good feeling to be honest. Removing the keyboard and putting the MacBook Pro back at the centre of my table kind of brings a symmetry. I use the inbuilt MacBook Pro speakers for casual music listening and now they are back in the centre of my desk, so the music isn’t coming from one side of the desk (where the MacBook previously was). I can use all 4 USB-C ports as the MacBook Pro is again centered, so I don’t have to worry about distance. The headphone jack too is nearby so I can skip the long extender cable I had – and that’s one less cable on my desk. Everything is great, except that I have to get used to the butterfly keyboard again. :) That takes some getting used to. I mean, I don’t mind the keyboard, and I can type reasonably well without mistakes on it (and the autocorrect helps a lot) but it is not 100% error free and 100% love. Something with more travel would definitely have been preferred. But well, you can’t get everything. Am pretty sure my hands are going to start hurting soon. :(

Another advantage of removing the keyboard is that I feel “freer”. As in, I am no longer tied to my desk or in need of all this extra paraphernalia. If I am bored of sitting at my desk or want to take the MacBook Pro elsewhere, I can just unhook it and go. I don’t need to carry around the keyboard etc. coz I am no longer used to it. I have a StarTech Thunderbolt to Dual HDMI adapter that drives my two monitors, so I unplug that and simply move on … whenever I need to. A very nice thing about macOS is that when I later plug in the adapter again, it automatically places the windows back on the screens they were on. So convenient! (The window manager isn’t without its faults, but I love this feature, and also the fact that even after a reboot it places everything back the way it was.) 

Update: Freedom is … sweaty

This is for anyone else who Googles “sweaty palms and MacBook Pro” and doesn’t find anything of use. Yes, there are plenty of results, but those are mostly to do with how to protect the keyboard from your sweaty palms – not what causes them in the first place! I don’t have much laptop experience (I usually plug in a keyboard) so maybe it’s not a MacBook thing, maybe it’s all laptops, but I noticed that when I use the MacBook Pro keyboard my palms are more sweaty. After one day of using the MacBook Pro without a keyboard, my palms are constantly sweating. And I know this is not a one-time thing coz the previous time (a 2 week stretch) when I did use the MacBook Pro directly without any keyboard I had the same issue. At that time I Googled and convinced myself the issue was with me rather than anything else … and just moved on (got myself an external keyboard basically, later on). Today I realize it is probably coz of the keyboard. 

Like I said, this may not be MacBook Pro related as such. But the lack of travel of the butterfly keyboard does stress my palms, and that’s probably a large contributing factor. I hate this lack of travel, and I find the layout takes a bit of getting used to … my fingers are definitely a lot more cranky since I started with this keyboard. 

Update 2: I tried, but decided to give up. I sit long hours with the MacBook Pro, so it is not good posture for me to sit bent over it either. I should ideally be staring at a monitor or at the MacBook Pro raised – like I was doing so far – so why am I trying to disable myself by taxing both my neck and hands with poor posture? If I raise the MacBook Pro in the centre of my desk and use the external keyboard again with it, I am hurting neither my fingers nor my neck. So this post eventually turned out to be about nothing, as I am back to where I started. :D 

Deploying Office 2016 language packs (using PowerShell App Deployment Toolkit)

I need to deploy a language pack for one of our offices via ConfigMgr. I have no idea how to do this! 

What they want is for the language to appear in this section of Office:

[screenshot]

I don’t know much about Office, so I didn’t even know where to start. I found this official doc on deploying languages, which talked about modifying the config file in ProPlus.WW\Config.xml. I spent a lot of time trying to understand how to proceed with that, and even downloaded the huge ISOs from VLSC, but had no idea how to deploy them via ConfigMgr. That is, until I spoke to a colleague with more experience in this and realized that what I am really after is the Office 2016 Proofing Toolkit. You see, language packs are for the UI (the menus and all that) whereas if you are only interested in spell check and all that stuff, what you need is the proofing tools. (In retrospect, the screenshot above says so – “dictionaries, grammar checking, and sorting” – but I didn’t notice that initially.) 

So, first step: download the last ISO in the list below (Proofing Tools; 64-bit if that’s your case). 

[screenshot]

Extract it somewhere. It will have a bunch of files like this:

[screenshot]

The proofkit.ww folder is your friend. Within that you will find folders for various languages. You can see this doc for a list of language identifiers and languages. In the root of that folder is a config.xml file.

By default this file does nothing; everything in it is commented out. If you want to add additional languages, you modify config.xml first and then pass it to setup.exe via a command like setup /config \path\to\this\config.xml. The setup command is the setup.exe in the proofing kit folder itself. 

Here’s my config.xml file which enables two languages and disables everything else.
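(My actual file isn’t reproduced here, so here’s a sketch of the idea. The OptionState element is what enables (State="local") or removes (State="absent") the proofing tools for a given LCID; the LCIDs below — 3082 for Spanish, 1036 for French, 1031 for German — are stand-ins, substitute whatever languages you need.)

```xml
<!-- Sketch only, not my verbatim file -->
<Configuration Product="Proofkit">
  <Display Level="none" CompletionNotice="no" SuppressModal="yes" AcceptEula="yes" />
  <!-- Enable the two languages you want... -->
  <OptionState Id="ProofingTools_3082" State="local" />
  <OptionState Id="ProofingTools_1036" State="local" />
  <!-- ...and set everything else to absent, e.g.: -->
  <OptionState Id="ProofingTools_1031" State="absent" Children="force" />
</Configuration>
```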

Step 2 would be to copy setup.exe, setup.dll, proofkit.ww, and proofmui.en-us to your ConfigMgr content store, into a folder of their own. It’s important to copy proofmui.en-us too. I had missed that initially and was getting “The language of this installation package is not supported by your system” errors when deploying. After that you’d make a new application which runs a command like setup.exe /config \path\to\this\config.xml. I am not going into the details of that. These two blog posts are excellent references: this & this.

At this point I was confused again, though. Everything I read about the proofing kit made it sound like a one-time deal: you install all the languages you want, and you are done. What I couldn’t understand was how I would go about adding/removing languages incrementally. What I mean is, say I modified this file to add Spanish and Portuguese as languages, and I deploy the application again … since all machines already have the proofing kit package installed, and its product code is already present in the detection methods, wouldn’t the deployment silently be ignored?

To see why this didn’t make sense to me, here are the typical instructions (based on the above blog posts):

  • Copy to content store
  • Modify config.xml with the languages you are interested in 
  • Create a new ConfigMgr application. While creating, you go for the MSI method and point it to the proofkit.ww\proofkitww.msi file. This fills in the MSI detection code etc. in ConfigMgr. 
  • After that, edit the application you created, modify the content location to remove the proofkit.ww part (because we are now going to run setup.exe from the folder above it), and modify the installation program in the Programs tab to be setup.exe /config proofkit.ww\config.xml.

[screenshot]

[screenshot]

Notice how the uninstall program and the detection method both have the product code of the MSI we targeted initially. So what do I do if I modify the config.xml file later and want to re-deploy the application? Since it will detect the MSI product code from the previous deployment it won’t run at all; all I can do is uninstall the previous installation first and then re-install – but that’s going to interrupt users, right? 

Speaking to my colleagues, it seems the general approach is to include all the languages you want upfront, then add some custom detection methods so you don’t depend on the MSI product code above, and push out new languages if needed by creating new applications. I couldn’t find mention of something like this when I Googled (probably coz I wasn’t asking the right questions), so here goes what I did based on what I understood from others. 

As before, create the application so we are at the screenshot stage above. As it stands, the application will install, and will detect that it has installed correctly if it finds the MSI product code. What I need to do is add something extra to this so I can re-deploy the application and it will notice that in spite of the MSI being installed it needs to re-install. First I played around with adding a batch file as a second deployment type after the MSI deployment type, having it add a registry key. Something like this:

This adds a key called OfficeProofingKit2016 with value 1. Whenever I change my languages I can update the value to kick off a new install. I added this as a detection method to the batch file deployment type, and made the MSI deployment type a dependency of it. The idea being that when I change languages and update the batch file and detection method with a new value, it will trigger a re-run of the batch file, which will in turn cause the MSI deployment type to be re-run. 

That turned out to be a dead end coz 1) I am not entirely clear on how multiple deployment types work and 2) I don’t think whatever logic I had in my head was correct anyways. When the MSI deployment type re-runs, wouldn’t it see the product is already installed and just silently continue?! I dunno. 

Fast forward. I took a bath, cleared my head, and started looking for ways in which I could do both the installation and the tattooing in the same script. I didn’t want to go with batch files as they are outdated (plus there’s the thing with UNC paths etc.). I didn’t want to do VBScript as that’s even more outdated :p and what I really should be doing is some PowerShell scripting, to be really cool and do this like a pro. Which led me to the PowerShell App Deployment Toolkit (PSADT). Oh. My. God. Wow! What a thing. 

The website’s a bit sparse on documentation, but that’s coz you’ve got to download the toolkit and look at the Word doc and examples in there. Plus a bit of Googling to get you started with what others are doing. But boy, is PSADT something! Once you download the PSADT zip file and extract its contents, there’s a toolkit folder with the following:

[screenshot]

This folder is what you would copy over to the content store of whatever application you want to install. And the “Files” folder within it is where you’d copy all the application deployment stuff – the things you’d previously have copied into the content store. You can install/ uninstall by invoking the Deploy-Application.ps1 file, or you can simply run the Deploy-Application.exe file.

[screenshot]

Notice I changed the deployment type to a script instead of MSI, as it previously was. The only program I have in there is Deploy-Application.exe.

[screenshot]

And I changed the detection method to be the registry key I am interested in with the value I want. 

[screenshot]

That’s all. Now for the fun stuff, which is in the Deploy-Application.ps1 file. 

At first glance that file looks complicated. That’s because there’s a lot of stuff in it, including comments and variables etc., but what we really need to concern ourselves with is certain sections. That’s where you set some variables, plus do things like install applications (via MSI or by directly running an exe like I am doing here), do some post-install stuff (which is what I wanted to do; the point of this whole exercise!), uninstall stuff, etc. In fact, this is all I had to add to the file for my stuff:
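(My exact additions aren’t reproduced here, but the gist looks something like the sketch below. Execute-Process, Set-RegistryKey, Remove-RegistryKey, and Show-InstallationProgress are PSADT’s built-in functions, and $dirFiles is its variable for the toolkit’s “Files” folder; the MyOrg key path is a made-up example.)

```powershell
## Installation section of Deploy-Application.ps1 (sketch, not my verbatim code)
Show-InstallationProgress -StatusMessage 'Installing Office 2016 proofing tools ...'
# $dirFiles points at the toolkit's "Files" folder, where setup.exe and proofkit.ww live
Execute-Process -Path "$dirFiles\setup.exe" -Parameters "/config $dirFiles\proofkit.ww\config.xml"
# Tattoo the registry so the ConfigMgr detection method keys off a value we control;
# bump the value whenever the language list changes to force a re-install
Set-RegistryKey -Key 'HKLM:SOFTWARE\MyOrg' -Name 'OfficeProofingKit2016' -Value '1'

## Uninstallation section (sketch)
Remove-RegistryKey -Key 'HKLM:SOFTWARE\MyOrg' -Name 'OfficeProofingKit2016'
```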

That’s it! :) That takes care of running setup.exe with the config.xml file as an argument. Tattooing the registry. Informing users. And even undoing these changes when I want to uninstall.

I found the Word document that came with PSADT and this cheatsheet very handy to get me started.

Update: Forgot to mention. All the above steps only install the languages on user machines. To actually enable them you have to use GPOs. Additionally, if you want to change keyboard layouts post-install, that’s done via a registry key. You can add it to the PSADT deployment itself. The registry key is HKEY_CURRENT_USER\Keyboard Layout\Preload. Here’s a list of values.
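For reference, here’s what that looks like as a .reg sketch. The value name “1” is the first layout in the load order, and 00000409 is the KLID for English (US); substitute the KLIDs for your layouts:

```
Windows Registry Editor Version 5.00

; "1" = first layout in the load order; 00000409 = English (US)
[HKEY_CURRENT_USER\Keyboard Layout\Preload]
"1"="00000409"
```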

macOS Terminal: make Ctrl-Left go to the beginning of the line

In most places on macOS the Ctrl-Left key makes the cursor go to the beginning of the line. But in the Terminal, when I press Ctrl-Left it just prints ;5D.

To change this, if one is using bash, we can use the bind command. You can type bind -p to see all the keyboard bindings in bash (thanks to this blog post). 

I need to map Ctrl-Left to the beginning-of-line function. 

If I open Terminal and go to the Keyboard section of its preferences, I see Ctrl-Left is mapped as \033[1;5D.

[screenshot]

So all I need to do is run the following two commands (one for beginning of line, another for end): 

bind '"\033[1;5C": end-of-line'
bind '"\033[1;5D": beginning-of-line'

Add these to .bashrc and they’re always bound in every new session. 

While in the Keyboard section, also tick “Use Option as Meta key”. This lets you use the Alt/ Option key in the Terminal. So I can do Alt-Left and Alt-Right to go back and forth one word. 

[screenshot]

Unable to install a Windows Update – CBS error 0x800f0831

Note to self for next time. 

Was trying to install a Windows Update on a Server 2012 R2 machine and it kept failing. 

Checked C:\Windows\WindowsUpdate.log and found the following entry:

2B00-40F5-B24C-3D79672A1800}	501	0	wusa	Success	Content Install	Installation Started: Windows has started installing the following update: Security Update for Windows (KB4480963)
2019-01-29 10:27:36:351 832 27a0 Report CWERReporter finished handling 2 events. (00000000)
2019-01-29 10:32:00:336 7880 25e8 Handler FATAL: CBS called Error with 0x800f0831,
2019-01-29 10:32:11:132 7880 27b4 Handler FATAL: Completed install of CBS update with type=0, requiresReboot=0, installerError=1, hr=0x800f0831

Checked C:\Windows\Logs\CBS\CBS.log and found the following:

2019-01-29 10:31:57, Info                  CBS    Store corruption, manifest missing for package: Package_1682_for_KB4103725~31bf3856ad364e35~amd64~~6.3.1.4
2019-01-29 10:31:57, Error CBS Failed to resolve package 'Package_1682_for_KB4103725~31bf3856ad364e35~amd64~~6.3.1.4' [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Info CBS Mark store corruption flag because of package: Package_1682_for_KB4103725~31bf3856ad364e35~amd64~~6.3.1.4. [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Info CBS Failed to resolve package [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Info CBS Failed to get next package to re-evaluate [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Info CBS Failed to process component watch list. [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Info CBS Perf: InstallUninstallChain complete.
2019-01-29 10:31:57, Info CSI 0000031d@2019/1/29:10:31:57.941 CSI Transaction @0xdf83491d10 destroyed
2019-01-29 10:31:57, Info CBS Exec: Store corruption found during execution, but auto repair is already attempted today, skip it.
2019-01-29 10:31:57, Info CBS Failed to execute execution chain. [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Error CBS Failed to process single phase execution. [HRESULT = 0x800f0831 - CBS_E_STORE_CORRUPTION]
2019-01-29 10:31:57, Info CBS WER: Generating failure report for package: Package_for_RollupFix~31bf3856ad364e35~amd64~~9600.19235.1.5, status: 0x800f0831, failure source: Execute, start state: Staged, target state: Installed, client id: WindowsUpdateAgent

So it looks like KB4103725 is the problem? This is a rollup from May 2018. Checked via DISM whether it was in any stuck state: nope!

dism /online /get-packages /format:table  | findstr /i "4103725"

I downloaded this update, installed it (no issues), then installed my original update … and this time it worked. 
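That is, I grabbed the KB4103725 .msu from the Microsoft Update Catalog and installed it manually with something along these lines (the exact file name will vary depending on what the Catalog gives you):

```
wusa.exe Windows8.1-KB4103725-x64.msu /quiet /norestart
```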

[Aside] Exporting Exchange mailboxes to PST (with the ‘ContentFilter’ date format fixed)

Quick shoutout to this blog post by Tony (cached copy, as I can’t access the original). The New-MailboxExportRequest cmdlet has a -ContentFilter parameter where you can specify the date and other criteria by which to filter what is exported. Very irritatingly, the date is expected to be in US format. And even if you specify it in US format to begin with, it converts it to US format again, switching the date and month and complaining if the result is an invalid date. Thanks to this blog post which explained this to me, and this blog post for a super cool trick to work around it. Thank you all!
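For reference, a typical export with such a filter looks something like this (mailbox name, share path, and dates are made up; note the US-format dates):

```powershell
New-MailboxExportRequest -Mailbox 'jdoe' `
    -ContentFilter "(Received -ge '01/15/2018') -and (Received -lt '06/15/2018')" `
    -FilePath '\\fileserver\PSTs\jdoe.pst'
```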

VMware HPE 3Par recommended settings

Made the following script yesterday to run on all our ESXi hosts to apply HPE recommended settings for 3Par.
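The script itself is no longer embedded here, but the gist of HPE’s recommendations for 3PAR is a SATP claim rule that uses Round Robin pathing with an IOPS limit of 1 for 3PARdata devices, plus per-device queue settings. Something along these lines, though check HPE’s VMware implementation guide for the authoritative values:

```
# Round Robin with IOPS=1 for all 3PARdata LUNs (ALUA arrays)
esxcli storage nmp satp rule add -s "VMW_SATP_ALUA" -P "VMW_PSP_RR" -O "iops=1" \
    -c "tpgs_on" -V "3PARdata" -M "VV" -e "HPE 3PAR Custom Rule"

# Per-device outstanding-requests queue setting (device ID is a placeholder)
esxcli storage core device set -d naa.xxxxxxxxxxxxxxxx -O 32
```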

No original contribution from my side here. It’s just something I put together from stuff found elsewhere.

I wanted to run this as a cron job on ESXi periodically, but apparently that’s not straightforward. (I wanted a cron job because I am not sure the queue settings can be set as a default for new datastores.) ESXi doesn’t keep cron jobs over reboots, so you have to modify some other script to inject a new crontab each time the host reboots. I was too lazy to do that.

Another plan was to try and run this via PowerCLI, as I had to do this on a whole bunch of hosts. I was too lazy to do that either, and PowerCLI seems like a kludgy way to run esxcli commands. Finally I resorted to plink (SSH was already enabled on all the hosts) to run this en masse:
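The loop itself was along these lines; hosts.txt holds “hostname: password” entries, and the path to the script on the shared datastore is made up:

```powershell
Get-Content .\hosts.txt | ForEach-Object {
    $hostname, $password = ($_ -split ':').Trim()
    # -batch stops plink from prompting interactively
    & plink.exe -batch -ssh "root@$hostname" -pw $password `
        'sh /vmfs/volumes/SharedDatastore/3par-settings.sh'
}
```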

This feels like cheating, I know. It requires SSH to be enabled on all hosts, and assumes I have put the script on some common datastore accessible across all of them. I am using PowerShell purely to loop over the contents of a text file consisting of “hostname: password” entries, and plink to connect to each host and run the script. (I love plink for this kind of stuff. It’s super cool!) It feels like a hotch-potch of many different things, and not very elegant, but lazy. (Something like this would be elegant: using PowerCLI properly, not just as a wrapper to run esxcli commands. But I couldn’t figure out the equivalent commands for my case; I was using FC rather than SCSI.)

Demoting a 2012R2 Domain Controller using PowerShell

Such a simple command. But a bit nerve-racking, coz it doesn’t have many options and you wonder if it will somehow remove your entire domain and not just the DC you are targeting. :)

Uninstall-ADDSDomainController

You don’t need to add anything else. This will prompt for the new local admin password and proceed with removal.
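If you’d rather supply everything up front and skip the prompts, the cmdlet’s documented parameters let you do something like the following (verify against your environment before running it):

```powershell
Uninstall-ADDSDomainController `
    -LocalAdministratorPassword (Read-Host 'New local admin password' -AsSecureString) `
    -DemoteOperationMasterRole:$true `
    -Force
```

-DemoteOperationMasterRole lets the demotion proceed even if this DC holds FSMO roles, and -Force suppresses the confirmation prompt.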