Dataverse publish limits: are you serious?

Author: Valentin Gasenko, Senior Power Platform Developer

Hi there!

An extremely funny story happened to me on a project.

I realize the stars will hardly ever align in quite this sequence for anyone else, but I can't help telling this story.

I have two environments where development takes place. DEV1 is in the UAE region, DEV2 in the GBR (UK) region. This is because the DEV1 team is based in the UAE and develops there, while the DEV2 team builds the Portal solutions, for which servers were unavailable in the UAE at the beginning of the year.

The point is that everything the team develops on DEV1 has to be imported into DEV2, and all the components must remain unmanaged.

In other words, DEV2 is a clone of DEV1 plus the Power Apps Portal part.

We release once a week, i.e., transfer development from DEV1 to DEV2 early Saturday morning via a release pipeline. And everything was fine, everything was great, until the partner team published several thousand custom components on DEV1 at once: hundreds of entities, hundreds of plugins, processes, web resources, and more. These components added up to a total of 2101.

No big deal, I thought. I asked the guys to split these components into several solutions: entity customizations, plugins, reports, and so on. I ended up with 10 solutions, each between 100 KB and 2 MB in size.

And so, at zero hour, I transferred all these unmanaged customizations from DEV1 to DEV2. The last step of my pipeline was Publish All.

I run it and it gives me:

Number of parameters in a condition exceeded maximum limit

I go into the CRM and try to do Publish All through the UI: the same error.

I start frantically googling. Google tells me some nonsense about limits on the SQL Server side. What limits, what server? I have a cloud solution, right?

Yes, but…

I contact technical support, and they tell me: yes, there really is a limit on publishing custom components at the same time. There must be no more than 2000 of them. (This actually matches those Google results: SQL Server allows at most 2100 parameters per query, and the publish apparently builds one big condition with a parameter per component, hence the error about "parameters in a condition".)

I was horrified! How could that be? I had never heard of it before!

But how do you fight this bug? They told me: of the 2101 components in the queue, 101 need to be published manually. That is, open an entity and press Publish, switch to the next one and press Publish, and so on, 101 times, until the queue drops to the allowed 2000.
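Incidentally, clicking Publish 101 times in the UI isn't the only option: Dataverse exposes a PublishXml action on the Web API that takes a ParameterXml listing the components to publish. A rough sketch of publishing entities one at a time through that action (the environment URL and the token acquisition are assumptions; you'd plug in your own auth):

```python
import json
import urllib.request


def build_publish_xml(entity_names):
    """Build the ParameterXml payload for the Dataverse PublishXml action."""
    entities = "".join(f"<entity>{name}</entity>" for name in entity_names)
    return f"<importexportxml><entities>{entities}</entities></importexportxml>"


def publish_entities_one_by_one(env_url, access_token, entity_names):
    """Call PublishXml once per entity, so each publish request touches only
    a single component instead of the whole queue."""
    for name in entity_names:
        body = json.dumps({"ParameterXml": build_publish_xml([name])}).encode("utf-8")
        req = urllib.request.Request(
            f"{env_url}/api/data/v9.2/PublishXml",  # env_url is an assumption, e.g. https://yourorg.crm.dynamics.com
            data=body,
            method="POST",
            headers={
                "Authorization": f"Bearer {access_token}",
                "Content-Type": "application/json",
            },
        )
        urllib.request.urlopen(req)  # raises urllib.error.HTTPError on failure
```

Whether publishing via the API counts against the same per-publish queue as the UI is something I can't confirm; treat this as a sketch, not a verified workaround.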

While I was swearing and humbly publishing 101 entities manually, it turned out that this limit existed only in the GBR region Dataverse! When configuring that region (not just my environment, but every environment in that region!), the folks at MS hadn't "turned something off" and ended up keeping the old limit I ran into. A more detailed description of what happened:

[Image - no alt text provided]

Now everything works properly and publishes fine, but the moral of this fable is:

1) always split large solutions into smaller ones;

2) if you're publishing via a pipeline, run a Publish All step after each small solution import (especially if it's unmanaged). Not once at the end, but after each import.
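The second point in pipeline terms: interleave imports and publishes so that no single publish has to process the whole backlog at once. A minimal sketch of that loop, where `import_solution` and `publish_all` are hypothetical wrappers around your pipeline tasks or the Dataverse API (not real library calls):

```python
def release(solutions, import_solution, publish_all):
    """Import unmanaged solutions one at a time, publishing after EACH import
    so every publish covers only that solution's components - keeping each
    publish safely under the component limit."""
    released = []
    for solution in solutions:
        import_solution(solution)  # e.g. an ImportSolution pipeline task
        publish_all()              # publish now, not after all imports
        released.append(solution)
    return released
```

With 10 solutions of ~200 components each, this turns one 2101-component publish into ten small ones.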

Good luck and cloudless development, everyone!
