
Troubleshooting the Properties Catalog Error 2147749902


Troubleshooting error 2147749902 (WBEM_E_INVALID_NAMESPACE) isn’t always straightforward. What started as a simple Intune error turned into a Device Inventory deep dive involving an empty SQLite database, a failing Device Inventory Agent service, and a missing CIM namespace. This blog retraces my steps, tools, and solutions to get everything back on track.

Introduction

When everything works as expected, it’s a relief. Policies deploy smoothly, devices get a linked enrollment without any issues, and the Device Inventory deployment reports are exactly as they should be. But then there are times when things go sideways, and this was one of those times.

It started simply: we rolled out the Properties Catalog policy to gather the Hardware Inventory data across devices. Expectations were high, but the results? An unexpected flood of errors in the Inventory Policy report.

Device Inventory Properties Catalog report showing errors

Devices were showing no inventory, no data, well just nothing. Opening the Inventory Properties Catalog report, the error seemed straightforward: 2147749902 (WBEM_E_INVALID_NAMESPACE).

2147749902 (WBEM_E_INVALID_NAMESPACE)

In a normal situation, a regular check-in to the service, AKA a sync, should fix it, right? Wrong. This issue was more than I bargained for. And so began our hunt to figure out:

  • Why the Properties Catalog was still showing the 2147749902 error while it synced successfully.
  • Why the Inventory Agent service wouldn’t start.

With ProcMon, dump files, and the agent’s code (DLLs) as our tools, we were determined to track down the problem. What followed was a fascinating journey through missing event logs, incomplete installations, and SQLite tables that refused to be born. Let’s retrace the steps we took, how we hunted down the cause, and what finally led us to the fix.

Step 1: Investigating the Inventory Agent Service

The first steps were pretty straightforward:

  • Checking the state of the Inventory Agent service. As shown below, the Microsoft Device Inventory Agent wasn’t running.
the microsoft device inventory agent was not running
  • Manually starting it threw errors, and the funny thing was that the InventoryAdapter folder (which contains the MOF files) was missing.
inside the microsoft device inventory agent the inventoryadapter folder was missing
  • The Application event log showed that the InventoryService.exe process crashed (in KernelBase.dll) the moment we manually tried to start the service.
inventoryservice.exe crashing the moment the service starts
  • The moment we manually created the InventoryAdapter folder, all the MOF files were placed in it and then merged into a single process.mof file.
the MOF files placed in the manually created InventoryAdapter folder
  • Inside the Inventory service folder, we noticed that the SQLite database was 0 KB, completely empty.
the 0 KB SQLite database inside the InventoryService folder

When opening the SQLite database, we noticed that it didn’t contain any tables or information.

When doing the same thing on a working device, we noticed that the HarvesterDatabase contained two tables.
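You can confirm this quickly from the command line with the sqlite3 CLI. A minimal sketch; the database path and file name below are assumptions based on what we found on our devices:

    # Inspect the Inventory Agent's SQLite database with the sqlite3 CLI
    # (https://sqlite.org/download.html). The path is an assumption.
    $db = "C:\Program Files\Microsoft Device Inventory Agent\InventoryService\HarvesterDatabase"

    # List all tables: the healthy device returned two tables,
    # the broken device returned nothing at all.
    & sqlite3.exe $db ".tables"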

This suggested a failure in the table creation process during initialization and made us inspect the Harvester log for clues. (as the log folder was still intact)

Step 2: Analyzing Clues in the Harvester Log

Obviously, you always need to start by reviewing the logs. While doing so, we found a clue in the Harvester log as to what was happening with the inventory service. The following error message was repeated constantly:

(0x80004005) SQLite Error 1: ‘no such table: InventoryEventCollector’

no such table: InventoryEventCollector

This was a significant discovery. It confirmed that the SQLite database existed but couldn’t initialize properly due to a missing table, even though the ProcMon trace showed the file was created successfully. (I was expecting an access denied message, which would have explained the 0 KB file.)
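If you want to surface that error quickly yourself, you can grep the Harvester logs from PowerShell. A small sketch; the log folder location is an assumption based on our device:

    # Scan the Harvester logs for the SQLite initialization error.
    $logFolder = "C:\Program Files\Microsoft Device Inventory Agent\InventoryService\Logs"
    Get-ChildItem $logFolder -Filter *.log |
        Select-String -Pattern "no such table: InventoryEventCollector" |
        Select-Object Path, LineNumber, Line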

This anomaly suggested something deeper was wrong. Armed with this knowledge, we moved on to validate the CIM namespace, suspecting that the failure to register the WMI class was part of the problem. Why? Because the InventoryAdapter folder that holds the MOF files was missing.

Step 3: Verifying the CIM Namespace with WBEMTest

The error 2147749902 screamed WBEM_E_INVALID_NAMESPACE, which typically means the CIM namespace doesn’t exist or has become corrupted. To confirm, we used WBEMTest to connect to the namespace:

  1. Opened wbemtest.exe from system context (psexec).
  2. Tried connecting to the root\MicrosoftDeviceManagement_Extensibility_Inventory namespace.
  3. Received the error: “0x8004100e, Invalid namespace.” The namespace does not exist.
2147749902 / 0x8004100e: namespace does not exist

While on a working device, I could open it without any issue at all!

The wmi namespace should have shown the inventoryadapter
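If you prefer to skip the WBEMTest UI, a quick CIM query from PowerShell performs the same check; a minimal sketch:

    # Query the inventory namespace; on a broken device this throws
    # "Invalid namespace" (0x8004100E), on a healthy one it returns quietly.
    try {
        Get-CimInstance -Namespace "root\MicrosoftDeviceManagement_Extensibility_Inventory" `
                        -ClassName "__NAMESPACE" -ErrorAction Stop | Out-Null
        "Namespace exists."
    } catch {
        $_.Exception.Message
    }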

Looking back at the 0x8004100e WMI error we got, it matches both the MDM declared configuration event logs and the 2147749902 error in the Intune portal; 2147749902 is simply the decimal form of 0x8004100E.

0x8004100e unknown error code

All these errors clearly told us something was broken. Next, we grabbed the installer by finding its download location.

finding the download location of the Device Inventory Agent installer

Once downloaded, we ran it manually. Failure. The installer told us that it could not start the Microsoft Device Inventory Agent service and that we should verify that we have sufficient privileges to start system services.

microsoft device inventory agent failed to start

This confirmed that something deeper was broken. We also attempted to manually re-register the CIM provider using the MOF compiler command:

re-registering the CIM provider with mofcomp
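For reference, a hedged reconstruction of what we ran; the InventoryAdapter folder path is an assumption based on our install:

    # Re-register every MOF file in the InventoryAdapter folder with mofcomp.
    $mofFolder = "C:\Program Files\Microsoft Device Inventory Agent\InventoryService\InventoryAdapter"
    Get-ChildItem $mofFolder -Filter *.mof | ForEach-Object {
        mofcomp.exe $_.FullName
    }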

Even though this command succeeded, the agent itself still failed to install, confirming that the CIM provider wasn’t the root cause. It was time to look under the hood.

Step 4: Dissecting DLL Behavior to Trace Failures

With manual installation failing and the CIM namespace missing, we turned our attention to the Inventory Agent DLL files.

the Inventory Agent DLL files

The goal was to understand the service’s flow and pinpoint where it was failing. By decompiling the agent’s code, we discovered how it initializes the service essentials:

  1. The service starts by reading the appsettings.json file.
  2. It initializes the loggers (ETW/file).
how the device inventory agent initialization has some essential requirements

Once the service essentials are taken care of, it attempts to create the SQLite database and tables, using the event logs as input. (Remember? The SQLite database was created but empty, with no tables.)

the HarvesterDatabase should create the tables if they don’t exist
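To mirror that startup order while troubleshooting, you can verify each essential in the same sequence the decompiled code initializes them. A rough sketch; every path and name below is an assumption from our lab device:

    $root = "C:\Program Files\Microsoft Device Inventory Agent\InventoryService"

    # 1. appsettings.json must exist and parse
    Test-Path "$root\appsettings.json"

    # 2. the loggers: the event log channel must be registered
    wevtutil gl "Microsoft-Windows-DeviceManagement-Enterprise-Diagnostics-Provider/InventoryAgent"

    # 3. the SQLite database must exist and contain its tables
    (Get-Item "$root\HarvesterDatabase").Length   # 0 bytes on our broken device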

Step 5: Debugging Initialization Essentials

Now knowing the inventory essentials, we decided to check the appsettings.json file in: C:\Program Files\Microsoft Device Inventory Agent\InventoryService\

I was expecting this file to be empty, despite the ProcMon trace showing it was created. I know from personal experience that messing around with this file can break the service (believe me…).

To our surprise, the file was present and fully populated. Key configurations like FrequencySettings, ValidationProcessingEnabled, and EventLogger settings were intact:

the fully populated appsettings.json file
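A quick way to confirm the file is present and parses correctly is to load it with ConvertFrom-Json. A small sketch; the key names in the comment are the ones we saw on our device, so treat them as illustrative:

    # Load and display appsettings.json; look for FrequencySettings,
    # ValidationProcessingEnabled, and the EventLogger settings.
    $settingsPath = "C:\Program Files\Microsoft Device Inventory Agent\InventoryService\appsettings.json"
    Get-Content $settingsPath -Raw | ConvertFrom-Json | Format-List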

This told us that the installer had partially succeeded: enough to drop files like appsettings.json, but the service was still failing.

This insight helped us focus our efforts. Since the Harvester log pointed to a missing table, and the service was clearly able to read the appsettings.json, something had to be going wrong between reading the appsettings and registering the MOF provider (which also worked manually).

The next service essential was the Logger!

Step 6: Comparing Device Behavior with ProcMon

Using ProcMon, we compared the behavior of a working device against the broken device while installing/reinstalling the agent.

What we found:

  • On a working device:
    • wevtutil.exe successfully processed the EventSchema.man file, creating the event log publisher.
    • When the service starts, a new event log is registered under:

Microsoft/Windows/DeviceManagement-Enterprise-Diagnostics-Provider/InventoryAgent

    • When we deleted that event log, it was recreated automatically.
    • The Inventory Agent service initialized and created the missing SQLite tables.
  • On a broken device:
    • wevtutil.exe also processed EventSchema.man successfully, as did manually trying to register the publisher.
manually registering the event log publisher with wevtutil

This confirmed that the publisher was created, but the Inventory event log itself somehow was not. As shown below, this is what it should look like!

the Inventory agent event log was missing on the device
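You can check both halves of this (publisher and log) with wevtutil. A small sketch; the exact publisher and channel names are assumptions derived from the path shown above:

    # The publisher existed on both devices...
    wevtutil get-publisher "Microsoft-Windows-DeviceManagement-Enterprise-Diagnostics-Provider"

    # ...but the InventoryAgent channel was missing on the broken one.
    wevtutil get-log "Microsoft-Windows-DeviceManagement-Enterprise-Diagnostics-Provider/InventoryAgent"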

Step 7: The Missing Link – Event Log Registration

When the service starts, it checks whether the Inventory Agent event log has been created.

If that event log does not exist, the Inventory Agent service should create it itself. We can spot this in the code as well.

the code inside the device inventory agent showing that when the service is started it should create that event log itself

This was a critical piece of the puzzle. Without this event log, the inventory event table will not be created. With the table creation failing, the SQLite database couldn’t be initialized, and as a result, the service would not start.

It was time to find out if I could now break it on a working device! To do so, I deleted the SQLite database and changed some permissions on the Inventory event log.

denying authenticated users to reproduce the issue

After denying authenticated users and restarting the service, guess what error it got me!

(0x80004005) SQLite Error 1: ‘no such table: InventoryEventCollector’

Success! I managed to replicate the issue by deleting the SQLite database and altering permissions on the Inventory Event log.
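For the record, roughly what that lab-only reproduction looked like; the channel name, SDDL string, database file name, and service display name are all assumptions, so please only try this on a test device:

    # 1. Deny Authenticated Users (AU) on the channel via its SDDL. Deny ACEs
    #    are evaluated first, so this blocks SYSTEM as well.
    $channel = "Microsoft-Windows-DeviceManagement-Enterprise-Diagnostics-Provider/InventoryAgent"
    wevtutil set-log $channel "/ca:O:BAG:SYD:(D;;GA;;;AU)"

    # 2. Delete the SQLite database.
    Remove-Item "C:\Program Files\Microsoft Device Inventory Agent\InventoryService\HarvesterDatabase"

    # 3. Restart the service and watch the Harvester log for the error.
    Restart-Service -DisplayName "Microsoft Device Inventory Agent"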

Step 8: Fixing the Inventory Agent with PowerShell

Now that I knew how to break it, it was time to try to fix it! I decided to write a PowerShell script using the .NET registry classes to ensure the weird-looking event log would be created. (The / characters in the path were a bit difficult to deal with when not using [Microsoft.Win32.Registry].)

a PowerShell script to fix the 2147749902 error
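A minimal sketch of the idea, assuming the channel is registered under the WINEVT\Channels key like the other DeviceManagement channels; the full script also sets the channel’s values (owning publisher, enabled flag, and so on), which I’ve omitted here. Run it as SYSTEM:

    # Pre-create the missing event log channel key so the MSI can succeed.
    # [Microsoft.Win32.Registry] is used because the "/" in the channel name
    # confuses the registry PSDrive provider.
    $channel = "Microsoft-Windows-DeviceManagement-Enterprise-Diagnostics-Provider/InventoryAgent"
    $base = [Microsoft.Win32.Registry]::LocalMachine.OpenSubKey(
        "SOFTWARE\Microsoft\Windows\CurrentVersion\WINEVT\Channels", $true)
    if ($base.GetSubKeyNames() -notcontains $channel) {
        $base.CreateSubKey($channel).Close()
    }
    $base.Close()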

With the power of drift control, the declared configuration service will kick off the MSI installation once more, and with the event log now present on the device, it will succeed!

Ensure you deploy the PowerShell script as SYSTEM… and please don’t try to install the MSI manually or tamper with it as I did; otherwise, you will end up with this error: “Different ACLs were found”, and, again, no data upload.

different ACLs were found in the InventoryAgent event log

My weird Paint 2147749902 flow

With the issue fixed, let’s dive into how I troubleshoot. When I start troubleshooting, I use ProcMon and compare the traces of a working and a non-working device. With that flow, I use the DLLs to find the moment it breaks and what should have happened! All of that information is saved in a simple MSPaint picture, so I can quickly draw lines and connect things.

Finding the root cause!

While trying to determine the root cause and fix it in different ways, I also tried to create the InventoryAgent event log by using the New-EventLog PowerShell command on a device that was missing the agent. This is what it got me:


Only the first eight characters of a custom log name are significant, and there is already another log on the system using the first eight characters of the name given.

This Microsoft Learn article sums up this behavior pretty well. The only real fix is to give the event log a unique name.
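You can reproduce this on any device that already has a classic log beginning with those eight characters; a tiny sketch with a purely hypothetical log name:

    # New-EventLog uses the classic event log API, which only treats the first
    # eight characters of a log name as significant. On a device where the
    # "Microsoft Information Protection" log exists, any name starting with
    # "Microsof" collides. "Microsoft-Demo-Log" is a made-up example.
    New-EventLog -LogName "Microsoft-Demo-Log" -Source "DemoSource"
    # -> Only the first eight characters of a custom log name are significant...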


But… did you spend time looking at the error I showed before?

It mentions that it stumbled upon the event log called Microsoft Information Protection! The moment you have the Microsoft Information Protection / Microsoft Purview Information Protection client tools installed on your device, they seem to break the Device Inventory Agent installation: both log names share the same first eight characters (“Microsof”).

Please note: You need to look at the full name, as shown above, in the registry or in the event log properties itself. For example, Microsoft Office Alerts, with the full name OAlerts, is not going to give you any issues!
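To spot which classic event logs on a device could cause this collision, you can list the subkeys under the EventLog service key and compare full names; a small sketch:

    # Classic event logs live under this key; the subkey name is the log's
    # full name. Anything starting with "Microsof" is a collision candidate.
    Get-ChildItem "HKLM:\SYSTEM\CurrentControlSet\Services\EventLog" |
        Where-Object { $_.PSChildName -like "Microsof*" } |
        Select-Object -ExpandProperty PSChildName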

Working together with MSFT to fix it!

The screenshot below speaks for itself! As you can see, Microsoft is aware of the issue, clearly pointing to the event log as the culprit and confirming that a fix is on the way. In the meantime, you can deploy the PowerShell script I shared earlier to resolve it!

Conclusion: A Missing Event Log Crippled Everything

This investigation highlighted the critical role of event logs in the Inventory Agent’s startup process. From the Harvester log’s SQLite error to the missing namespace in WBEMTest, every clue pointed us toward the root cause: a failure in the event log creation flow.

With ProcMon, WBEMTest, and the agent’s source code, we traced the issue step by step and restored the service.

Key Takeaway:
A sync would normally be sufficient when faced with 2147749902 errors and empty inventory reports, but in this case, it wasn’t. If you want to fix it, you need to dig into the logs, validate namespaces, and check dependencies like event logs and configuration files. Sometimes, the smallest missing piece can bring down the entire system.
