New VS2015 Update 3 Runtime breaks MFC apps built with VS 2015 Update 2

Update: 07/21/2016: This issue is fixed! As noted on the connect bug in a comment made by James McNelis, “this is fixed by the update to Visual Studio 2015 Update 3 that was made available on July 20, 2016. We would advise all of our developer customers to move forward to this update.”

An earlier commenter noted, “This issue has been fixed by KB3165756 version 14.0.25424.00, released on 07/20/2016. The member m_bIsDragged has been removed from CMFCToolBarButton. The fix contains the new VC Runtime 14.0.24212.0. When you install this on a client machine MFC apps built with VS 2015 Update 2 will run without issues.” The commenter further asked when the new runtime would be deployed via Windows Update; James replied that there are no plans to do so.

So if you do deploy apps targeting VS 2015 Update 3, please ensure you use the 14.0.24212 runtime included with this July 20th update (the previous one was 14.0.24210).

How do you get this patch? If you’ve already installed Update 3, do not run the Update 3 installer again; it won’t detect that anything needs installing. You must use the patch located here instead:

As noted on the above page, the fix for this issue is included in the July 20th update.

Original blog post (written before the fix was made available):

The following is courtesy of a connect bug that was recently filed.

There is also an MSDN forum thread about this:

Credit for the text below goes to the original poster; none of it is mine (it has been copied verbatim from the connect bug). I’m just passing the info forward to show that binary breaks (due to DLL Hell) still occur after all these years.

If you build an MFC app with Visual Studio 2015 Update 2 that creates a temporary CMFCToolBarButton object on the stack, and then run it on a machine with VC Runtime 14.0.24210.0 (which ships with VS 2015 Update 3), the app is broken.

In a Debug build you get this error:
“Run-Time Check Failure #2 – Stack around the variable ‘ToolbarButton’ was corrupted”

In a Release build the behavior depends on what on the stack gets overwritten. In my case the app doesn’t start at all.

The problem is caused by the new BOOL member m_bIsDragged in class CMFCToolBarButton, so the class’s memory layout differs between Update 2 and Update 3. When the constructor initializes m_bIsDragged, it overwrites the (stack) memory beyond the end of the app’s ToolBarButton object.

My thoughts: this type of bug is difficult to fix because some people may already have taken a dependency on the new MFC that shipped with VS 2015 Update 3. What happens if Microsoft reverts to the Update 2 header signature for existing apps (i.e., removes m_bIsDragged)? To solve this properly they would have to make MFC dynamic somehow: detect at runtime which MFC version the app was built with and do something fancy to compensate. I don’t think that would be easy. Alternatively, they could sacrifice the few for the many (and simply revert to the Update 2 definition), or pretend it never happened and force all apps to rebuild against the new signatures (the worst option).

The fact that this problem occurred at all tells me that binary compatibility is not being actively checked. Adding data members to MFC classes is a big no-no.

That is scary to me: a critical shipping app can suddenly break because newer DLLs were installed by some third party, or even by Windows Update (via a security update). Again, the only way to avoid this is to use app-local DLLs, but that hurts security, is error-prone, and does nothing for apps that have already shipped expecting MFC to remain compatible between updates.


How do I service the Universal CRT if a bug is encountered?

Recently a serious bug in the Universal CRT was discovered that breaks all MFC apps due to MFC’s use of _sntscanf_s in its DDX_Text routine for doubles.

This raises an interesting point. The bug is in the Universal CRT. No longer can you just grab a new vcredist_x86.exe or a new runtime DLL and drop it into your app folder (along with an app-local MFC, of course). You now have to worry about the fact that the bug is in a system component, ucrtbase.dll. This is a consequence of the “great refactoring” of the CRT:

So then, how do we service ucrtbase.dll? Do we just wait for it to show up in Windows Update? Or get the Universal CRT SDK and build a redist?

One possible answer lies here:

The answer is app-local distribution (placing the DLLs in the same folder as your app). Microsoft originally prohibited the Universal CRT from app-local distribution, but then changed their minds. Note that this is a problem for apps that span multiple folders.

Note: to do this app-local distribution properly, you cannot simply include ucrtbase.dll. You also have to include a series of 23 other files, named api-ms-win-*.dll, a list of which can be found here. Ugly, but it works.
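As a hypothetical example of what the resulting app folder looks like (the forwarder DLL names shown are illustrative; consult the list linked above for the complete set required by your CRT version):

```text
MyApp\
    MyApp.exe
    mfc140.dll                          (app-local MFC)
    vcruntime140.dll                    (VC runtime)
    ucrtbase.dll                        (the Universal CRT itself)
    api-ms-win-crt-runtime-l1-1-0.dll   (API-set forwarder DLLs...)
    api-ms-win-crt-stdio-l1-1-0.dll
    api-ms-win-crt-math-l1-1-0.dll
    ...                                 (and the rest of the forwarder set)
```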

But, according to Microsoft, from the second comment on this blog post:

“On Windows 10, the real Universal CRT in the system directory will always be used, even if you have the Universal CRT DLLs included app-locally”

So on Windows 10, how do I fix a bug in ucrtbase.dll? Do I need a Windows Update to get it serviced? Seems like it. In other words, it’s not possible to ship an app that’s totally self-contained with all bug fixes included.

Can we call this DLL Hell 3.0?

How to target XP with VC2012 or VC2013 and continue to use the Windows 8.x SDK

One of the limitations of the Microsoft-provided solution for targeting XP with Visual Studio 2012 (Update 1 and above) or Visual Studio 2013 is that you must use a special “platform toolset” in project properties, which forces use of the Windows 7.1 SDK (instead of the default Windows 8.x SDK). The other function the platform toolset provides is to set the linker’s “Minimum Required Version” setting to 5.01 (instead of the default of 6). But that part can just as easily be done manually in project properties.

So how about the first main function of the platform toolset?   Setting the platform toolset to one that targets XP does the following:

(1) Changes the Platform SDK being used from Windows SDK 8.x (8.1 with VC2013 and 8.0 with VC2012) back to Windows SDK 7.1

(2) Adds a preprocessor define:  _USING_V110_SDK71_ to the build

The second one turns out to be important, due to a piece of code in atlwinverapi.h, namely the following:

extern inline BOOL __cdecl _AtlInitializeCriticalSectionEx(__out LPCRITICAL_SECTION lpCriticalSection, __in DWORD dwSpinCount, __in DWORD Flags)
{
#if (NTDDI_VERSION >= NTDDI_VISTA) && !defined(_USING_V110_SDK71_) && !defined(_ATL_XP_TARGETING)
     // InitializeCriticalSectionEx is available in Vista or later, desktop or store apps
     return ::InitializeCriticalSectionEx(lpCriticalSection, dwSpinCount, Flags);
#else
     // ...otherwise fall back to using InitializeCriticalSectionAndSpinCount.
     return ::InitializeCriticalSectionAndSpinCount(lpCriticalSection, dwSpinCount);
#endif
}

As you can see, if we do not use the platform toolset that defines _USING_V110_SDK71_ (i.e., we don’t use the Windows 7.1 SDK), then the first branch is compiled in and the code calls InitializeCriticalSectionEx, a function only available on Vista and above. This will cause your binary to fail to load on XP.

But what if we really want to use the Windows 8.x SDK (taking care, of course, not to call any Windows 8.x functions directly, to keep support for older operating systems)? Why would we want this? We may want a structure definition, a preprocessor define, or a function declaration from the newer SDK; in other words, we may want to support some Windows 8.x feature when our app actually runs on that OS.

Say we’ve decided to use the Windows 8.x SDK while still allowing our app to run on XP. Are there any options (i.e., can we keep using the v110/v120 toolsets instead of the v110_xp/v120_xp toolsets)? Yes; it turns out Microsoft left a nice loophole in the code to do exactly that. Notice the mysterious define named _ATL_XP_TARGETING in the block of code above. It turns out this is an alternative way to enable XP targeting when _USING_V110_SDK71_ is NOT defined. So if you really want to support XP while using the Windows 8.x SDK, you simply need to ensure your code is built with _ATL_XP_TARGETING defined. The easiest way is to add /D_ATL_XP_TARGETING to the C/C++ command-line options in project properties.

Then the only other step is to set “Minimum Required Version” (in project properties under Linker > System) to 5.01, and we’re all set: a simple way to target XP and still use the Windows 8.1 SDK without the platform toolset Microsoft provided for XP targeting.
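If you prefer to bake both settings into the project file itself, the equivalent .vcxproj fragment might look like this sketch (the element names are standard MSBuild, but verify them against your own project file):

```xml
<ItemDefinitionGroup>
  <ClCompile>
    <!-- alternative XP-targeting switch, instead of _USING_V110_SDK71_ -->
    <PreprocessorDefinitions>_ATL_XP_TARGETING;%(PreprocessorDefinitions)</PreprocessorDefinitions>
  </ClCompile>
  <Link>
    <!-- 5.01 = Windows XP (x86); x64 builds would use 5.02 -->
    <MinimumRequiredVersion>5.01</MinimumRequiredVersion>
  </Link>
</ItemDefinitionGroup>
```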

In summary, the _ATL_XP_TARGETING define, while undocumented, is an interesting way to keep supporting XP while continuing to use the Windows 8.x SDK (rather than being permanently stuck on the older Windows 7.1 SDK).

Was forbidding desktop applications on Windows RT the correct move?

Every day, someone at work bugs me about the news reports on how the Surface RT was not the success Microsoft had hoped it would be. Asus came to the same conclusion. I tell them the Surface RT was a great piece of hardware; it had the misfortune of not being able to run desktop apps.

But what if Windows RT (Windows on ARM) had allowed desktop applications (recompiled for the ARM instruction set) to run, instead of just the ones Microsoft allowed (i.e., Office and several built-in apps such as Notepad, Calculator, and some remote debugging tools)?

I’ve been following the threads on xda-developers about how the digital signature check for desktop apps was circumvented completely, in effect opening the platform up for desktop development.

Apart from the exploit, how was this possible at a tools level? Well, if we go back to //build (in 2011) and look at the beta version of Visual Studio 2012, you’ll notice something interesting: a complete version of MFC for ARM (including static .lib files). You’ll also notice that several key Windows SDK libraries (such as the common controls) were excluded, making those MFC libraries more difficult to take advantage of.

The question remains: why would they have included MFC if they hadn’t planned on allowing developers to build desktop apps targeting ARM? My theory is that the original plan was to allow desktop app development for ARM, but at some point it was decided that desktop apps should be controlled and their creation reserved for Microsoft itself. Hence, when the first developer preview came out at //build, remnants of the original plan were still there.

Subsequent builds removed the MFC libraries that were included in that first beta.

So, coming back to the original question: was forbidding desktop apps on ARM the correct move? At the surface level (pardon the pun) it looks like a good decision. The WinRT (Metro) environment provided for store apps is tightly controlled, which benefits battery life, security, app revenue, and so on. But it stifles competition. Look at VLC for ARM right now: they can’t make a desktop app, so they were forced to go to Kickstarter to fund a Windows 8 (Metro) version of their app. It’s still under development due to the tight controls over which APIs are allowed in WinRT apps. It’ll be interesting to see when, if ever, they release something.

Imagine having Chrome or Firefox on your Surface RT. A good thing? I’m not sure. Or your favorite app, recompiled for ARM using Visual Studio and released directly by its developer rather than through the store. A good thing? Hmm, the pros and cons are hard to weigh. If you’re starting from a zero ecosystem and trying to build it up, maybe desktop apps would have helped. On the other hand, they might have caused developers to focus less on Windows Store apps and more on their legacy desktop apps.

But judging from the developer buy-in for WinRT (Windows Store apps), it’s a moot point. The fact is, there hasn’t been enough buy-in. And developer buy-in is the key to the ecosystem’s success: developers must weigh the risk of earning no revenue from a store app against continuing with their legacy desktop apps on Intel only. Had desktop apps been allowed on ARM, it’s possible more of those developers would have been excited about Windows RT.

New in Windows 8.1 store apps: a way to separate your app from your resources

One of the biggest complaints about the Windows 8 store app approach to localization (separate translations for each language you decide to support) was the inability to decouple the various localizations from the main app.

As I’ve discussed in previous blog posts, the satellite DLL approach for Windows desktop apps is an excellent one that can be used successfully with a fair amount of manual work (and can be automated quite easily when targeting Vista and above). But in Windows 8 store apps there was no real analogue to it.

Windows 8.1 introduces a new type of package: the resource package. MSDN describes it well here; I’ll provide a brief summary:

A resource package is a subset of your app that provides resources for a particular language, scale, or DirectX feature level. When an app is deployed to a machine, a decision is made as to which resource packages that machine needs: the app package itself can be deployed with none of the resource packages, some of them, or all of them. This is great for two reasons: it potentially speeds up downloads and it reduces disk space.

An app bundle manifest (.appxbundlemanifest) is what describes your app’s package and all its resource packages.

The great thing about this new system is that Visual Studio 2013 handles it for you automatically (it separates the resources into their own resource packages).

There is also a package API that lets you get information about packages; Microsoft has prepared a sample, found here:

as well as another great sample that shows you how this resource package approach could be used in a game:

If you’re OK with targeting Windows 8.1 for a future Windows Store app (see previous blog posts on the pros and cons of targeting Windows 8 vs. Windows 8.1), this is an excellent new system that I believe will be a great boon for developers.

A note about Stroustrup’s The C++ Programming Language 4th Edition

As far as C++ books are concerned, this is the definitive reference, from the inventor of C++, Bjarne Stroustrup.  I grew up with his 2nd, 3rd, and Special Editions, and I highly recommend that you take a look at the 4th for its great C++11 content.

Now, I don’t recommend you buy it right away. Yes, that statement may surprise you, but let me explain. Stroustrup is one of those authors who takes accuracy seriously, so he tends to go through many “printings” of his books, making corrections in each printing based on reader feedback.

Take a look at the errata for his 3rd and special editions:

He went through 21 printings of the 3rd edition and 14 printings of the special edition (which was basically a continuation of the 3rd edition, except in hardcover), for a total of 35 printings when you include the special editions (disclaimer: some of these overlapped, i.e., early special edition printings were equivalent to later 3rd edition printings, but you get the idea).

Why am I mentioning this?  Because the 4th edition is very young.  He’s already up to the 3rd printing after a couple of months:

Also, complaints have been made about the flimsy (physical) nature of the original release of the 4th edition.  Yes, it’s a paperback.  However, it looks like Addison-Wesley has heard the complaints because a hardcover version of the book will be released on July 29th!

So I recommend you take a look at buying the hardcover version after a few months of revisions.  You’ll have the majority of “big” fixes, and then you can follow along the errata pages for future fixes.

How do you know which printing you’re going to get if you order online? It’s hard to know; you could get a very early one depending on stock, though Amazon tends to go through stock quickly. Another option is to visit your local bookseller and check the copyright page to see which printing you’re getting.

Visual Studio 2013 support for targeting Windows 8

I’ve been working my way through the multitude of //build 2013 session videos on channel9, and I came across an interesting presentation:

Upgrading Windows 8 Apps to Windows 8.1

There is a lot of really great information about the gotchas for deciding on making an app that is specific to Windows 8.1.

The main point to remember: once you make your app target Windows 8.1 (e.g., by converting it to an 8.1 app and taking advantage of 8.1-specific APIs), it will not run on Windows 8. On the other hand, if you target Windows 8, your app will run under both Windows 8 and Windows 8.1.

Here’s the kicker: Visual Studio 2013 will NOT support creating new Windows 8 store apps; you’ll only be able to create Windows 8.1 store apps. However, you will be able to edit and build existing Windows 8 projects with Visual Studio 2013.

So if you want to continue targeting Windows 8 when creating new store apps, you’ll need both Visual Studio 2012 and Visual Studio 2013 installed. You really only need Visual Studio 2012 to create the project; once it’s created you can switch over to Visual Studio 2013.

This seems to be not a technical limitation but a way to encourage developers to target Windows 8.1 from the get-go when creating a new app.

To me it would make more sense for Visual Studio 2013 to support creating Windows 8 store apps, since the infrastructure to edit and build existing Windows 8 projects is already there.