Creative Commons Attribution 4.0 International License
© Rakhesh Sasidharan

[Aside] SPNs

Trying to get people at work to clean up duplicate SPNs, and came across some links while reading about this topic. 

From the official MSDN article: A service principal name (SPN) is a unique identifier of a service instance. SPNs are used by Kerberos authentication to associate a service instance with a service logon account. This allows a client application to request that the service authenticate an account even if the client does not have the account name.

Basically, when a client application tries to authenticate with a service instance and the domain controller needs to issue it Kerberos tickets, the domain controller needs to know whose password to use for the service instance – is it that of the computer account of the server where this instance runs, or of a service account responsible for it? This mapping of service -> service account/ computer account is an SPN. It’s of the format service/host:port and is associated with the AD account of the service account or computer account (stored in the servicePrincipalName attribute, actually).
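The duplicate-SPN cleanup I mentioned is usually done with the built-in setspn tool. A quick sketch of the relevant switches (the domain and account names below are made up):

```shell
:: List the SPNs registered on an account (computer or service account)
setspn -L CONTOSO\svc-web

:: Scan the forest for duplicate SPNs -- the problem mentioned above
setspn -X

:: Register an SPN; -S (unlike the older -A) checks for duplicates first
setspn -S HTTP/webserver01.contoso.com CONTOSO\svc-web

:: Remove a stale/duplicate SPN from the wrong account
setspn -D HTTP/webserver01.contoso.com CONTOSO\old-svc
```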

That’s all!

[Aside] DFRS links

Just putting these here as bookmarks to myself.

One of our DCs at work had the following DFSR warnings in the DFS Replication logs:

The DFS Replication service stopped replication on volume C:. This occurs when a DFSR JET database is not shut down cleanly and Auto Recovery is disabled. To resolve this issue, back up the files in the affected replicated folders, and then use the ResumeReplication WMI method to resume replication.

Additional Information:
Volume: C:
GUID: 56234A2C-C156-11E2-93E8-806E6F6E6111

Recovery Steps
1. Back up the files in all replicated folders on the volume. Failure to do so may result in data loss due to unexpected conflict resolution during the recovery of the replicated folders.
2. To resume the replication for this volume, use the WMI method ResumeReplication of the DfsrVolumeConfig class. For example, from an elevated command prompt, type the following command:
wmic /namespace:\\root\microsoftdfs path DfsrVolumeConfig where volumeGuid="56234A2C-C156-11E2-93E8-806E6F6E6111" call ResumeReplication

For more information, see http://support.microsoft.com/kb/2663685.
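As an aside, the volume GUID and the state of each replicated folder can be queried from the same WMI namespace the event points at. A sketch using the DFSR WMI classes (run from an elevated prompt on the affected server):

```shell
:: List DFSR volume GUIDs and the volumes they map to
wmic /namespace:\\root\microsoftdfs path DfsrVolumeConfig get VolumeGuid,VolumePath

:: Check the state of each replicated folder
:: (0 Uninitialized, 1 Initialized, 2 Initial Sync, 3 Auto Recovery, 4 Normal, 5 In Error)
wmic /namespace:\\root\microsoftdfs path DfsrReplicatedFolderInfo get ReplicatedFolderName,State
```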

Sounded like an easy fix, so I went ahead and tried resuming replication as directed. That didn’t work though. Got the following:

The DFS Replication service stopped replication on the folder with the following local path: D:\SYSVOL_DFSR\domain. This server has been disconnected from other partners for 154 days, which is longer than the time allowed by the MaxOfflineTimeInDays parameter (60). DFS Replication considers the data in this folder to be stale, and this server will not replicate the folder until this error is corrected.

To resume replication of this folder, use the DFS Management snap-in to remove this server from the replication group, and then add it back to the group. This causes the server to perform an initial synchronization task, which replaces the stale data with fresh data from other members of the replication group.

Additional Information:
Error: 9061 (The replicated folder has been offline for too long.)
Replicated Folder Name: SYSVOL Share
Replicated Folder ID: xxxxx
Replication Group Name: Domain System Volume
Replication Group ID: xxxxxx
Member ID: xxxxx

Yeah, bummer!

Check out this Microsoft blog post for content freshness and the MaxOfflineTimeInDays parameter. You can’t simply remove SYSVOL from DFSR replication groups via the GUI as it is a special folder, so you have to work around that. I found some forum and blog posts that suggested simply raising this parameter on the broken server to a number larger than the number of days it’s currently been offline (154 in the above case) and then resuming replication. I wasn’t too comfortable with that. What if older changes from this server now replicate to the other servers? That could cause more damage than it’s worth. I don’t think this will happen, but why take a risk. What I really wanted was to force a replication onto this server from some other server – a non-authoritative replication, basically. So I followed the steps in this article and that worked.
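For reference, the steps boil down to something like this (this is the Microsoft-documented non-authoritative DFSR SYSVOL sync procedure; the DN below is illustrative – substitute your own DC name and domain):

```shell
:: 1. On the broken DC, disable its SYSVOL DFSR membership. In PowerShell:
::    Set-ADObject -Identity "CN=SYSVOL Subscription,CN=Domain System Volume,CN=DFSR-LocalSettings,CN=BROKEN-DC,OU=Domain Controllers,DC=contoso,DC=com" -Replace @{ "msDFSR-Enabled" = $false }

:: 2. Force AD replication so all DCs see the change
repadmin /syncall /AdeP

:: 3. Make DFSR poll AD and act on it
dfsrdiag pollad

:: 4. Set msDFSR-Enabled back to $true on the same object, then repeat:
repadmin /syncall /AdeP
dfsrdiag pollad

:: 5. DFSR now performs an initial (non-authoritative) sync from a partner,
::    logging event 4614 when it starts and 4604 when the sync completes.
```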

A non-authoritative sync is like a regular sync, just that it is rigged to let the source win. :p So all the existing files on the destination server are preserved. The event log gets filled with entries like these:

The DFS Replication service detected that a file was changed on multiple servers. A conflict resolution algorithm was used to determine the winning file. The losing file was moved to the Conflict and Deleted folder.

Additional Information:
Original File Path: D:\SYSVOL_DFSR\domain\Policies\{F4A04331-3C62-474A-A1CE-517F17914111}\GPT.INI
New Name in Conflict Folder: GPT-{4B7F4510-3C34-4A8A-A397-BC736AE5D9B6}-v55459.INI
Replicated Folder Root: D:\SYSVOL_DFSR\domain
File ID: {8189DA2F-DCBE-4755-A042-72154B648111}-v949
Replicated Folder Name: SYSVOL Share
Replicated Folder ID: 05ECDBE0-A0E6-4A9A-B5BE-7C404E323600
Replication Group Name: Domain System Volume
Replication Group ID: 46C8DB83-463F-459F-8785-DCB19231C52B
Member ID: 7A0E2DA6-4841-40A1-B02D-F5F341345B98
Partner Member ID: 7851F335-6824-4B0D-9978-5A5520ECD547

If you want to see where these files are moved to, check out this blog post. That post has a lot more useful info.

Thoughts on leaving things midway

When I was a kid I used to always finish whatever book I was reading. Once I reached college and was generally lost in life my reading habit took a turn for the worse and many a time I didn’t manage to finish what I began reading. I took this as a negative thing, and much later, when I became relatively less lost in life, I started reading avidly again and tried to finish whatever I started. I didn’t read every book I bought – mostly because I was still sort of lost in life, but also because the books I started reading since college were mostly non-fiction and I just never got around to reading them all due to changing interests. However, if I started a book, I did my best to complete it, especially if it was fiction.

I don’t know if this is a good decision though. I don’t have an answer either way – I am just unsure. The reason why I take this habit of “not finishing a book” as bad is coz that’s what gets drilled into your head. If you start something but don’t complete it, it’s generally frowned upon. Plus I read this essay as a child where the author said that young people do more coz they don’t have a choice – they are forced to do things by school or parents etc and so they do what they are told even if they don’t like it, and generally manage to make something of it – but as we grow older we have choices and so become spoilt/ pampered and just give up the first time something doesn’t go our way.

I get these points but nowadays I also feel that finishing something just coz we have to finish it is probably just a waste of time. Yes, it’s an accomplishment that you don’t leave things halfway, but maybe it’s better to just restrict this philosophy to stuff that matters? Like say if you are a person who does a half job of everything – then yes, not good! But if you try and do a good job of most things, and mostly succeed too, perhaps it is ok to ignore it when it comes to some areas (such as reading)? I don’t know.

If I am watching a TV show or movie and leave it midway I don’t chide myself. But I do when it comes to reading a book. That’s because reading a book is more effort than watching something, but at the end of the day both are entertainment after all. If the objective is to be entertained then why must I attach more importance (and suffering) to reading?

One reason why I am thinking all this now is Audible. They have their Great Listen guarantee wherein if you don’t like an audiobook you can return it. That’s amazing coz sometimes I just don’t like an audiobook – not coz of the narrator or narration or quality etc, I just don’t like it. But since Audible gives me permission to return it I don’t have any guilt that oh, I bought something and will be wasting money by not listening to it. If I am not enjoying it, I can return it – period. There’s nothing Bad involved. Wish similar programs existed for eBooks too!

Recently for instance I started reading two books. “The 100 Year Old Man Who Climbed Out Of The Window and Disappeared” and also “The Winter Fortress”. 

“The 100 Year” is great in movie form, and only sort of interesting in reading form. I guess coz that sort of content translates well to a movie structure with a good score and camera work etc; while if you are reading it, all the coincidences and luck get trite after a while.

“The Winter Fortress” is a great non-fiction book about the efforts during World War II to destroy a factory in Norway that produced heavy water (used in making nuclear bombs). I read about a third of it and it was a great read. I didn’t know most of it. Then I got sidetracked with some other stuff (father-in-law passing away) and I lost the flow. Now I am trying to get back into it but am not in the mood coz I have simply lost the flow. I tried to cheat by purchasing the audiobook version but a) I am still not managing to get into the mood and b) the narrator wasn’t that great (I didn’t like his voice). I was able to return the audiobook thanks to Audible and so felt no guilt, but I had a heavy heart deciding what to do about the eBooks. Finally I decided it was pointless wasting more time with these two books and decided to move on. And thought I’d write this post too, putting my thoughts down. :)

Part of me feels bad at leaving these two books midway. But (a larger) part of me is relieved at moving on coz I would just have been depressed trying to get “entertained” with these books and not getting anywhere. 

Recently I also finished hearing James Franco’s narration of Stephen King’s “The Dead Zone”. This book was nothing like I expected – it was quite detailed, and the overall plot was simple but what mattered was the details and descriptions and thoughts etc – and while I struggled to get through it, I didn’t let go because it was manageable. I knew it was only a case of me expecting something else; the book was well written and narrated and I could hold on till the end. Sometimes it’s worth it; sometimes (like now) it isn’t. Just got to make a case-by-case decision I think rather than follow some overarching “policy”.

TV Updates

Think it’s been a while since I posted any TV watching updates. :)

Broadchurch – Season 3

An amazing season and a wonderful end to a great show. Everything about this season was great. The plot, the characters, the music, just everything … you will be missed!

Medici: Masters of Florence

Happened to see this on Netflix and checked it out. Nothing great, but worth a watch if you have nothing else to do. The actor who played the main chap (Cosimo) had a dead look about him. Not sure if that’s intentional. The music is great, including the opening title sequence (which is there on Apple Music). The stories were enlightening in that I wasn’t aware of how much interaction there was between the Church and banks. The show started slowly but became better as it progressed. I remember binge-watching the last few episodes coz I was curious what would happen.

Peaky Blinders

On the last episode of Season 1 currently. A good show so far – enjoying it. Not too fast but not too slow either. Good characters, great set/ lighting/ camera work. Good music. I began watching it coz it’s created by Steven Knight, who directed “Locke” which was an amazing movie. Cillian Murphy & Sam Neill are great. Looking forward to Tom Hardy in the next season. 

Baahubali

Wanted to put this somewhere and it didn’t seem to be worth a blog post of its own. I re-watched “Baahubali: The Beginning” a few days ago – mainly to refresh myself and also for the benefit of my daughter who hadn’t seen it. Then saw “Baahubali: The Conclusion” yesterday. Both movies are filled with typical South Indian heroism where the hero can do just about anything and everyone else just watches in awe; and all the women are mesmerized by the hero, with the heroine usually being a strong/ arrogant character until she meets the hero, after which he manages to “break” her (kind of like how you’d break a bull) and then she too is all smitten by him. The latter movie has an overdose of heroism, which frankly I couldn’t bear, but it also has a lot more special effects and some amazing scenes. Definitely watch it for those. Whatever I feel about the heroism, the director S.S. Rajamouli is a genius for envisioning this sort of stuff.

I guess I hate these heroism kind of movies coz I grew up watching Hindi and Tamil movies which were filled with these, and to my child mind that seemed to be how the world is – where if you are a good person you are a hero and have special powers and can do good and move mountains, and everyone looks up to you – but as I grew up and reality hit, I realized that the world isn’t like this. So my mind sort of revolts at this misinformation. I have nothing against superhero movies coz these usually have an origin story or something that explains why the heroes are “super”; I am only against movies where regular heroes are just able to do super things for no reason except that they are a hero and this is a movie. That gets to me. I know movies are unreal and movies such as Baahubali are fantasy – but when it becomes too fantastical my mind is unable to digest it and I lose interest in the movie.

Self-learning; picking up new stuff

I realize over time that I am not good at learning things. As in, if I have to pick up something because say it’s a new topic and I must read about it, or maybe there’s an exam/ certification I want to clear and so must study for it – I just can’t do those kinds of tasks. I am also not good at just picking up stuff by doing it – like say learning Linux by installing a distro and spending some time with it. I just don’t work that way.

I knew this from before but used to consider it a negative quality of mine, mixed with fears that maybe I am not good enough. But nowadays I realize that while it still is not a good way to be, that’s just how I am and there’s no point overly thinking about it. Just have to take it in my stride.

Like now for instance – I attended a Citrix course some months back and want to do its certification. Thought I’d get the list of objectives and course material and read through it and prepare myself. But I am just unable to focus. Knowing this nature of mine I had previously tried setting up a Citrix lab to get the hang of stuff. While that was more of a success than this current idea of reading, it too didn’t get to the point I wanted because I am not good at creating my own objectives – especially when I know it’s a “fake” one. It’s sort of like how I enjoy walking, but ask me to do a treadmill or just walk outdoors for exercise and I can’t do it. I’ll walk if there’s a need to – I don’t hate walking, in fact I love walking and think I am quite good at it – but I am not going to go for walks just for kicks. Weather and mood permitting I might go for a walk just to listen to some podcast or an audiobook; but that wouldn’t be coz I want to walk, it would be coz I want to listen to something and walking will let me do it peacefully.

This is a difficult situation to be in when you are an IT professional. If my workplace is one where there are plenty of new projects happening or things to do, it is a good state coz I know I will jump into these and quickly pick stuff up and do wonders; but if my workplace is not of that sort then I will get bored and get into a rut soon – stagnating and becoming pretty useless. This nature leaves me at the mercy of my environment rather than letting me be a self-driven person. That sucks!

Anyways, time to go back and read Citrix. Enough distractions via blogging. :)

[Aside] Citrix VDI Best Practices for XenApp and XenDesktop 7.6 LTSR

This is an amazing document! Skimming through the PDF version and I am blown away. Some day when I have to make Citrix related decisions, this is the document I will be turning to. (Came across it via the Citrix blog, so thank you!)

There’s also a XenDesktop handbook but I haven’t read it yet. 

Rakshadhikari Baiju Oppu – a slice of life

Saw the mallu movie “Rakshadhikari Baiju Oppu” today. It was a delightful watch. Slightly long, and the ending was kind of sad; but I loved it. It’s the sort of movie that doesn’t really have any story. It’s like the director/ story writers just captured a slice of life in a village and its characters (centered around a chap called Baiju, played by Biju Menon). The movie reminded me of Adam Sandler’s “Grown Ups”. They are not the same, but very similar. Both movies, to me, have a similar feel – as if someone dipped into the water of village life and bottled a bit of it for us to see and enjoy.

“Rakshadhikari” touches on many things. Friendship, sports, studies, life, love, failed love, new gen, old gen, happiness, sadness … and Baiju is sort of the central character in all of this. He is not the hero or main person or anything like that. He’s not a Rajnikanth :) just someone who is there and whom everyone looks up to, makes fun of, can depend upon … As one of his friends says before the intermission, he is a lucky man who’s happy. People run around trying to find happiness – Baiju just is happy. It’s not like he is doing anything to gain respect or be happy – he just does what he likes, and is.

The movie isn’t preachy. Nor does it try to take a side in old vs new or nature vs technology etc. It makes fun of FB and relationships over FB but at the same time highlights the benefits of social apps like WhatsApp that let two old friends keep in touch. Even the hospital that takes over the playground in the end isn’t portrayed in a negative light. Hospitals are useful, and that is subtly mentioned in a scene. And the only message the movie ends with is that all this progress and running towards wealth and career and ambition etc is good, but we must not forget playgrounds and chilling out. Simple. The movie doesn’t even end on a high note like a typical “movie” might – with some forced happy ending. Life isn’t always happy; it’s more sadness than happiness, one might say, but it moves on and you take it in that (sportsman) spirit and go along with it… and that’s how the movie too ends.

Check it out! I liked it. 

[Aside] PVS Caching

Was reading this blog post (PVS Cache in RAM with Disk Overflow) when I came across a Citrix KB article that mentioned this feature was introduced because of the ASLR feature introduced in Windows Vista. Apparently when you set the PVS Cache to be the target device hard disk, it causes issues with ASLR. Not sure how ASLR (which is a memory thing) should be affected by disk write cache choices, but there you go. It’s something to do with PVS modifying the Memory Descriptor List (MDL) before writing it to the disk cache, and then when Windows reads it back and finds the MDL has changed from what it expected it to be, it crashes due to ASLR protection. 

Anyhow, while Googling that I came across this nice Citrix article on the various types of PVS caching on offer:

  • Cache on the PVS Server (not recommended in production due to poor performance)
  • Cache on device RAM
    • A portion of the device’s RAM is reserved as cache and not usable by the OS. 
  • Cache on device Disk
    • It’s also possible to use the device Disk buffers (i.e. the disk cache). By default it’s disabled, but can be enabled.
    • This is actually implemented via a file on the device Disk (called .vdiskcache).
    • Note: the device Disk could be the disks local to the hypervisor or could even be shared storage to the hypervisors – depends on where the device (VM) disks are placed. Better performance with the former of course. 
  • Cache on device RAM with overflow to device Disk
    • This is a new feature since PVS 7.1. 
    • Rather than use a portion of the device RAM that is not usable by the OS, the RAM cache portion is mapped to the non-paged RAM and used as needed. Thus the OS can use RAM from this pool. Also, the OS gets priority over PVS RAM cache to this non-paged RAM pool.
    • Rather than use a file for the device Disk cache, a new VHDX file is used. It is not possible to use the device Disk buffers though. 

The blog post I linked to also goes into detail on the above. Part 2 of that blog post is amazing for the results it shows and is a must read for those and for the general info it provides (e.g. IOPS, how to measure them, etc). Just to summarize though: if you use cache on device RAM with overflow to device Disk, you get tremendous performance benefits. Even just 256 MB of device RAM cache is enough to make a difference.

… the new PVS RAM Cache with Hard Disk Overflow feature is a major game changer when it comes to delivering extreme performance while eliminating the need to buy expensive SAN I/O for both XenApp and Pooled VDI Desktops delivered with XenDesktop. One of the reasons this feature gives such a performance boost even with modest amounts of RAM is due to how it changes the profile for how I/O is written to disk. A XenApp or VDI workload traditionally sends mostly 4K Random write I/O to the disk. This is the hardest I/O for a disk to service and is why VDI has been such a burden on the SAN. With this new cache feature, all I/O is first written to memory which is a major performance boost. When the cache memory is full and overflows to disk, it will flush to a VHDX file on the disk. We flush the data using 2MB page sizes. VHDX with 2MB page sizes give us a huge I/O benefit because instead of 4K random writes, we are now asking the disk to do 2MB sequential writes. This is significantly more efficient and will allow data to be flushed to disk with fewer IOPS.

You no longer need to purchase or even consider purchasing expense flash or SSD storage for VDI anymore. <snip> VDI can now safely run on cheap tier 3 SATA storage!

Nice!
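The quoted claim is easy to sanity-check with some quick arithmetic. A sketch (the cache size is a made-up illustrative number, not a measured figure):

```python
# Rough arithmetic behind the quoted claim: flushing the same amount of
# cache data as 2 MB sequential writes needs far fewer I/O operations
# than as 4 KB random writes.

KB = 1024
MB = 1024 * KB

cache_to_flush = 100 * MB  # hypothetical amount of overflow to flush

ios_4k_random = cache_to_flush // (4 * KB)   # traditional VDI write profile
ios_2mb_seq = cache_to_flush // (2 * MB)     # PVS VHDX overflow profile

print(ios_4k_random)                  # 25600 writes of 4 KB each
print(ios_2mb_seq)                    # 50 writes of 2 MB each
print(ios_4k_random // ios_2mb_seq)   # 512x fewer I/O operations
```

The disk does the same amount of total work in bytes, but sequential 2 MB writes are vastly cheaper per byte than 4 KB random writes, which is why the IOPS load on the SAN drops so dramatically.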

A follow-up post from someone else at Citrix to the two-part blog post above (1 & 2): PVS RAM Cache overflow sizing. An interesting takeaway: it’s good to defragment the vDisk as that gives up to 30% write cache savings (an additional 15% if the defrag is done while the OS is not loaded). Read the blog post for an explanation of why. Don’t do this with versioned vDisks though. Also, cache on device RAM with overflow to device Disk reserves 2 MB blocks in the cache and writes in 4 KB clusters, whereas cache on device Disk used to write in 4 KB clusters without reserving any blocks beforehand. So it might seem like cache on device RAM with overflow to device Disk uses more space, but that’s not really the case …

As a reference to myself for later: LoginVSI seems to be the tool for measuring VDI IOPS. Also, yet to read these but two links on IOPS and VDI (came across these from some blog posts):

Time and all that …

This is something I wrote while killing time in the metro today … was in a bit of a “mood” so this is not one of my typical techie posts. Feel free to skip. You have been warned! :)

Listening to Stephen King’s “The Dead Zone” read by James Franco. I pre-ordered it after watching “11.22.63”. From the book blurb I thought it would be more sci-fi or horror, but so far it’s been slow, thoughtful, and quite well-written (yes I know I have no right to say that, just that I expected the book to be something else and am pleasantly surprised by what it has turned out to be). I don’t know where the story is going yet… there seems to be one main strand with a few little strands strewn over so far and am guessing they all intersect at some point. I am only some 3 hours into a 16 hour book, so plenty of time left! 

Listening to this book reminded me of ‘time’ from “11.22.63” (same author) as well as “Slaughterhouse-Five” (same narrator). Both talk about ‘time’ differently but with the same idea. Both books treat ‘time’ as frozen/ pre-determined and “11.22.63” especially has this idea of time fighting back if you try and change it. I liked that and wish the book had elaborated more on it. 

If you view ‘time’ as frozen (i.e. this moment has already happened, the future has happened) then ‘time’ is ‘fate’. The question of changing your fate or trying to change your luck then becomes a case of trying to work against ‘time’. Which is sort of interesting coz then you can see ‘time’ working against your efforts. I hate that but also find it fascinating because that makes ‘time’ or ‘fate’ kind of sentient or purposeful (like they are really “out to get you” :p). 

A long time ago I had come across Dilbert comics author Scott Adams’ “affirmations” concept. Basically you think of something you want and keep repeating that idea as a sentence many times a day. For example: “I will get a score of 100/100 in my exam on Saturday”. Write this sentence down every morning, say 20 times. That’s affirmations. The exact details are variable – as in maybe you could type it or just say it aloud to yourself; or maybe no need to do it in the morning but just at some point during the day or at regular intervals through the day… you get the point. I had tried it many years ago and nothing happened. At that point I felt maybe I wasn’t doing it correctly and so left it (and in fact later on things kind of turned out to be the opposite of what I had wished for – story of my life! :p). Didn’t think much of it.

Some months ago I came across this idea again in one of his books and also a few podcast interviews. Tried it again, this time with more earnestness, and this time I felt there was a sudden “kick back” from time in terms of changing things such that the things I was affirming for were no longer possible. And then I saw “11.22.63” and the concept of ‘time’ fighting back entered my mind and has been sitting there since. I’ve tried a few other things similar to affirmations (both before and after watching “11.22.63”) and every time there’s been a kick back – often a strong one that completely derailed what I was wishing for. These kinds of events reaffirm my thinking that time is frozen, and if you try taking a blow torch to thaw it a bit, it fights back! :) I guess words like “frozen” and “blow torch” are not the right ones – it’s more like the path is bound with strings tied to other strings in a sort of self-correcting machine mechanism, and if you try to make changes the mechanism kicks in and sorts things out to ensure you stay on the path.

It’s a depressing way of thinking, but everyone has a path set out, and there’s not much we can do to budge from it. And in the few instances where we do feel we’ve managed to change things, that’s probably coz that change itself was written in the path.

[Aside] PVS vs MCS

Haven’t read most of these. Just putting them here for when I need ’em later.

GPO audit policies not applying

I didn’t realize my last post was the 500th one. Yay to me! :)

Had an issue at work today wherein someone had modified a server GPO to enable auditing but nothing was happening.

The GPO had the following.

And it looked like it was applying (output from gpresult /scope computer /h blah.html).

But checking the local policies showed that it wasn’t being applied.

Similarly the output of auditpol /get /category:* showed that nothing was happening.

This is because starting with Server 2008/ Vista Microsoft split the above audit categories into sub-categories, and starting with Server 2008 R2/ 7 allowed one to set these via GPO.

My understanding from the above links is that both these sorts of policies can mix (especially if the newer ones are not defined), so I am not entirely sure why the older audit policies were being ignored in my case. There’s even a GPO setting that explicitly lets one choose either set over the other, but that didn’t have any effect in my case. (The policy is “Audit: Force audit policy subcategory settings (Windows Vista or later) to override audit policy category settings”; setting it to DISABLED gives the original policy categories precedence. By default this is ENABLED.)

The newer audit policy categories & sub-categories can be found under the “Advanced Audit Policy Configuration” section in a GPO. In my case I defined the required audit policies here and they took effect.
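To verify what actually took effect, auditpol is the tool. A sketch of the switches I find useful (subcategory names are the English-locale ones; the backup path is illustrative):

```shell
:: Show the effective audit policy, all categories and subcategories
auditpol /get /category:*

:: Show just one subcategory
auditpol /get /subcategory:"Logon"

:: (For testing outside of GPO) set a subcategory directly
auditpol /set /subcategory:"Logon" /success:enable /failure:enable

:: Back up / restore the local audit policy
auditpol /backup /file:C:\temp\audit-backup.csv
auditpol /restore /file:C:\temp\audit-backup.csv
```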

Something else before I conclude (learnt from this official blog post).

By default, GPOs applied to a computer can be found at %systemroot%\System32\GroupPolicy. Local audit policies are stored/ defined at %systemroot%\system32\GroupPolicy\machine\microsoft\windows nt\audit\audit.csv and then copied over to %systemroot%\security\audit\audit.csv. However, audit policies from domain GPOs are not stored there. This point is important to remember coz occasionally you might find forum posts that suggest checking the permissions of these files. They don’t matter for audit policies from domain GPOs.

In general it is better to use auditpol.exe /get /category:* to find the effective audit policy settings rather than group policy tools.