Regression: dotnet test hangs for 15 minutes after test run completed on mac and linux #9452

Open
AArnott opened this issue Jun 1, 2018 · 51 comments

@AArnott
Contributor

AArnott commented Jun 1, 2018

Our builds on VSTS include dotnet test, which restores, builds, and tests our product. With dotnet CLI 2.1.100 all went well, but after we upgraded to 2.1.300, the dotnet tool does not exit until 15 minutes after the test run has completed, with no output to explain the delay.

Expected behavior

The dotnet tool exits immediately, as can be seen on this 2.1.100 run:

[screenshot]

Actual behavior

The dotnet tool exits after waiting 15 minutes, as can be seen on this 2.1.300 run:

[screenshot]

Steps to reproduce

Please contact me via my microsoft.com email address for links to the builds if you'd like to investigate further.

@livarcocc
Contributor

I am not sure what you are doing in VSTS, but I suspect this might be long-running servers that are still alive when the test finishes.

In 2.1.300, we now have a long-running VBCSCompiler server, an MSBuild node-reuse server, and a Razor compilation server.

They all stay around for some time to avoid re-JITting and improve performance if you attempt another build.

You can turn them off by running dotnet build-server shutdown after the test execution. Or, you can set MSBuild properties/environment variables to prevent them from running in the first place.
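
For example, a rough sketch of what that could look like as an extra step in the build script (the solution name here is a placeholder):

# Run restore/build/test as before
dotnet test MySolution.sln

# Then explicitly shut down the long-running build servers
# (MSBuild worker nodes, VBCSCompiler, Razor) so nothing is
# left alive to keep the agent busy.
dotnet build-server shutdown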

@AArnott
Contributor Author

AArnott commented Jun 1, 2018

I suspect this might be long running servers that are still live when the test finishes.

Perhaps. But why would that prevent the tool from exiting?

You can turn them off by running dotnet build-server shutdown after the test execution

How would I run that on VSTS after test execution, since test execution itself doesn't return control to the next step in my build until the 15 minutes are up?

Or, you can set msbuild properties/env variables to prevent them from running to begin with.

I'm willing to try that. Where do I find the variables to set? And if it works, is that then a bug in the dotnet CLI, in that it waits for those servers even though the work is already finished?

@livarcocc
Contributor

@peterhuene can you help @AArnott in getting the properties to turn off the long running servers?

@AArnott I don't understand enough about how VSTS does test execution to know why it would treat the test run as not finished in that case.

@peterhuene
Contributor

peterhuene commented Jun 1, 2018

Use -p:UseRazorBuildServer=false to disable the Razor (rzc) server.

Use -p:UseSharedCompilation=false to disable the Roslyn (vbcscompiler) server.

For MSBuild, pass /nodeReuse:false on the command line to disable node re-use.
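
Putting those together, a single invocation that disables all three might look something like this (assuming dotnet test forwards these switches to the underlying MSBuild invocation; the solution name is a placeholder):

# Disable the Razor server, the Roslyn compiler server, and MSBuild node reuse
dotnet test MySolution.sln \
  -p:UseRazorBuildServer=false \
  -p:UseSharedCompilation=false \
  -nodeReuse:false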

I don't know of anything that might be waiting on the servers. There was an issue where MSBuild might hang with node re-use enabled on non-Windows platforms (dotnet/msbuild#3161), but that was fixed for RTM.

Would it be possible to capture the stacks of the hang?
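
For anyone hitting this today, one way to grab a dump of the stuck processes is the dotnet-dump global tool (which postdates this comment; the PID lookup below is only a sketch):

# Install the diagnostic tool once
dotnet tool install --global dotnet-dump

# While the command is hanging, find the lingering dotnet/MSBuild processes
ps aux | grep dotnet

# Capture a dump of the hung process for later analysis (replace <pid>)
dotnet-dump collect --process-id <pid>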

@jskeet

jskeet commented Jun 3, 2018

FWIW, I'm seeing hangs on Linux for dotnet build unless I disable node reuse too. This happens on a machine where I'm running BenchmarkDotNet benchmarks, which does its own builds as well - so it's possible that something in that is jamming a build node. (My next step will be to find a way of disabling build reuse globally...)

jskeet referenced this issue in jskeet/nodatime Jun 3, 2018
This is to work around a Linux issue with .NET Core 2.1
https://github.com/dotnet/cli/issues/9397

(So far we haven't seen it hurt the Travis build, which may have it
disabled by default.)
jskeet referenced this issue in nodatime/nodatime Jun 3, 2018
This is to work around a Linux issue with .NET Core 2.1
https://github.com/dotnet/cli/issues/9397

(So far we haven't seen it hurt the Travis build, which may have it
disabled by default.)
@AArnott
Contributor Author

AArnott commented Jun 3, 2018

I can confirm that setting MSBUILDDISABLENODEREUSE=1 on macOS and Linux removes the hang at the end of the build+test invocation as well.
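
For reference, this is just an environment variable exported before the CLI is invoked; roughly (the solution name is a placeholder):

# Disable MSBuild worker-node reuse for everything started from this shell,
# so no long-lived nodes are left for dotnet test to wait on.
export MSBUILDDISABLENODEREUSE=1

dotnet test MySolution.sln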

@AArnott
Contributor Author

AArnott commented Jun 3, 2018

FWIW, I vaguely recall a bug in a Windows process my team owned where a parent process spawned my child process, and as long as my child process was running, the parent process never entirely exited. We fixed it (I don't remember how, but could potentially find out). I wonder if something similar is happening here: dotnet test (which builds first) spawned the reusable node processes as children (instead of as top-level processes, perhaps?), and now the parent process cannot exit until the reusable nodes time out and exit. In that case, perhaps the fix is to change how the child processes are spawned so that they do not retain a child relationship with the parent.

@peterhuene
Contributor

@rainersigwald Some reports of hitting hangs with node reuse above.

@rainersigwald
Member

This sounds exactly like what dotnet/msbuild@e604f5f addressed. But evidently that was incomplete or doesn't work everywhere.

@jskeet @AArnott what's the best way for me to repro your cases? Can I just clone nodatime (before nodatime/nodatime#1157) and run build/runbenchmarks.sh?

@jskeet

jskeet commented Jun 4, 2018

@rainersigwald: I believe that should repro it, yes. You'll need to specify the targets to run - netcoreapp2.0 netcoreapp2.1 is what I do normally. It's slightly harder to tell as I've normally run it as part of a cron job; I'd be happy to try it beforehand if that would help. Note that it's expected to take multiple hours, but it should produce a lot of output most of the time.

When it's broken, you should be able to run dotnet build anywhere (e.g. in src/NodaTime) and it'll hang.

@rainersigwald
Member

@jskeet Oh, that's interesting. It repros with dotnet build and not just dotnet test?

I think I understand the dotnet test problem: I think we're accidentally running the before-test build within the context of the

https://github.com/dotnet/cli/blob/7ce47778b4a5238db133cd0d4ffc3aa2695e406c/src/dotnet/commands/dotnet-test/Program.cs#L99

environment, leading the build nodes to be created in a hangy way.

But that wouldn't apply to build so I'll have to try debugging through the nodatime repro.

@jskeet

jskeet commented Jun 4, 2018

@rainersigwald: I don't run dotnet test at all (in this workflow, or on that machine) - it repros just using dotnet run with BenchmarkDotNet, but that does some more builds in the background. I don't know all the details of what it does, but I doubt that it calls dotnet test.

But once it's in the "broken" state, then yes, dotnet build hangs - as does dotnet restore. (Specifying -nodeReuse:false on the command line makes both work.)

I'm happy to try to get it into the broken state again and then poke around - are there diagnostics that would be useful to try at that point?

@reduckted
Contributor

I just want to chime in and say that I'm experiencing this issue as well. I haven't been able to reproduce it reliably yet, but I'll keep trying and once I do, I'll post a link to a repo here.

The scenario I have is a unit test that is testing some MSBuild targets files. This involves calling dotnet build during the unit test. This is basically what it does:

  1. Write a project file that uses the targets to disk.
  2. Compile the project using dotnet build.
  3. Look at the output from the build and assert some stuff.

When running this via dotnet test, it sometimes hangs for 15 minutes after the tests complete (the test duration reported is only a few seconds), but sometimes it doesn't hang at all.

@rainersigwald
Member

@reduckted You should be able to resolve that by calling dotnet build -nodereuse:false inside your test. What's happening for you is that the outer dotnet test sets the environment variable that MSBuild uses to decide whether to attach the console or not. That is propagated down through child processes to the test process and then to the test's child dotnet build process, which then starts with node reuse but without the safe disconnect-console node startup. Explicitly specifying "no node reuse" should cause the worker nodes to exit immediately after the build, allowing the wait-on-child-process code to complete quickly.
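
In other words, the inner invocation spawned by the test should look roughly like this (the project name is a placeholder):

# Build spawned from inside the test: opt out of node reuse so the worker
# nodes exit as soon as this build finishes, instead of lingering and keeping
# the outer dotnet test's wait-on-child-processes alive.
dotnet build GeneratedProject.csproj -nodereuse:false

# Alternative using the equivalent environment variable:
MSBUILDDISABLENODEREUSE=1 dotnet build GeneratedProject.csproj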

@reduckted
Contributor

Thanks @rainersigwald. 👍 I had tried that after reading some of the earlier posts and the hangs appeared to stop, but since I couldn't reliably reproduce it to begin with, I wasn't 100% sure it was the solution.

@GeorgDangl

FYI, this is happening for me on Ubuntu 18.04 with dotnet 2.2.300. Passing -nodereuse:false to dotnet test solves the problem for me.

/usr/bin/dotnet test --test-adapter-path . --framework netcoreapp2.2 --logger xunit;LogFilePath=/home/.../output/testresults-linux.xml -nodereuse:false

@wakuflair

wakuflair commented Dec 20, 2019

I can confirm that this still happens on .NET Core 3.1

ubuntu@VM-0-4-ubuntu:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.1 LTS
Release:        18.04
Codename:       bionic
root@195d912bb800:/# dotnet --version
3.1.100
Test run for ***********************(.NETCoreApp,Version=v3.1)
Microsoft (R) Test Execution Command Line Tool Version 16.3.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Starting test execution, please wait...

A total of 1 test files matched the specified pattern.                      <---- hangs forever

-nodereuse:false doesn't work either.

@msftgits msftgits transferred this issue from dotnet/cli Jan 31, 2020
@msftgits msftgits added this to the Discussion milestone Jan 31, 2020
@vraravam

I can confirm that this still happens on .NET Core 3.1

ubuntu@VM-0-4-ubuntu:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description:    Ubuntu 18.04.1 LTS
Release:        18.04
Codename:       bionic
root@195d912bb800:/# dotnet --version
3.1.100
Test run for ***********************(.NETCoreApp,Version=v3.1)
Microsoft (R) Test Execution Command Line Tool Version 16.3.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Starting test execution, please wait...

A total of 1 test files matched the specified pattern.                      <---- hangs forever

-nodereuse:false doesn't work either.

This is the exact spot where my test run also hangs.

One interesting point I noticed is that the same code, without changes, runs in Visual Studio for Mac (the very first test method runs OK, and the others take 2+ minutes each - pure unit tests with mocking, so not even hitting the DB). Not sure if that can help solve the issue.
dotnet version: 3.1.201

@srihere17

srihere17 commented Apr 15, 2020

I too was facing the same issue, where tests hang when executed through Visual Studio or through the command line. In my case, the issue turned out to be caused by a NuGet package I was using to generate fake objects: AutoBogus. The library has an open issue for the same problem (AutoBogus Issue). When I switched to Bogus, the library which AutoBogus builds on, it fixed the issue. My tests are running smoothly now. Sharing this here in case it helps anyone.

@rohit21agrawal
Contributor

rohit21agrawal commented Jul 9, 2020

I am still seeing this behavior on Azure DevOps with .NET Core 3.1.301 on a Windows hosted agent with VS2019.

Running dotnet test within a PowerShell script causes it to hang at the end of execution. I'm willing to share build and repro details.

E.g., it gets stuck here for 40+ minutes:

[screenshot]

CC: @rainersigwald @peterhuene @livarcocc

@StanislavChankovSellerCloud

StanislavChankovSellerCloud commented Sep 30, 2020

Same issue on Windows Server 2019 Standard,
dotnet version 5.0.100-preview.8.20417.9.

[screenshot]

Passing -nodereuse:false to dotnet test does not solve this issue.

@bencef

bencef commented Oct 15, 2020

I think I might have found the culprit: dotnet test expects to be run in a terminal window.

When run in an Emacs shell:

[bence@x250:/tmp]$ dotnet new xunit --name XUnitTiming
The template "xUnit Test Project" was created successfully.

Processing post-creation actions...
Running 'dotnet restore' on XUnitTiming/XUnitTiming.csproj...
  Restore completed in 463.81 ms for /tmp/XUnitTiming/XUnitTiming.csproj.

Restore succeeded.


[bence@x250:/tmp]$ echo $TERM
dumb

[bence@x250:/tmp]$ time dotnet test XUnitTiming/XUnitTiming.csproj 
Test run for /tmp/XUnitTiming/bin/Debug/netcoreapp3.1/XUnitTiming.dll(.NETCoreApp,Version=v3.1)
Microsoft (R) Test Execution Command Line Tool Version 16.3.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Starting test execution, please wait...

A total of 1 test files matched the specified pattern.

Test Run Successful.
Total tests: 1
     Passed: 1
 Total time: 1.2366 Seconds

real	0m23.992s
user	0m3.246s
sys	0m0.379s

[bence@x250:/tmp]$ 

When run in xterm:

[bence@x250:/tmp]$ echo $TERM
xterm

[bence@x250:/tmp]$ time dotnet test XUnitTiming/XUnitTiming.csproj 
Test run for /tmp/XUnitTiming/bin/Debug/netcoreapp3.1/XUnitTiming.dll(.NETCoreApp,Version=v3.1)
Microsoft (R) Test Execution Command Line Tool Version 16.3.0
Copyright (c) Microsoft Corporation.  All rights reserved.

Starting test execution, please wait...

A total of 1 test files matched the specified pattern.
                                                                                                                                                                                                                                                                                                                              
Test Run Successful.
Total tests: 1
     Passed: 1
 Total time: 1.2068 Seconds

real    0m2.699s
user    0m2.961s
sys     0m0.318s

[bence@x250:/tmp]$ 

Those 20 seconds in the first case are spent after the line
A total of 1 test files matched the specified pattern.
where I assume it is expecting to read something from the terminal.

@Leonardo18

Hey folks, I'm facing this issue too. I have a project with BDD tests using SpecFlow, mocking external APIs using WireMock.Net. When the pipeline runs, the line "A total of 1 test files matched the specified pattern." stays stuck for 16 minutes, and it's just 5 test scenarios.

russcam added a commit to russcam/apm-agent-dotnet that referenced this issue Jan 7, 2021
russcam added a commit to elastic/apm-agent-dotnet that referenced this issue Jan 14, 2021
This commit separates out the common test components
from Elastic.Apm.Tests into a new assembly,
Elastic.Apm.Tests.Utilities. This allows assemblies containing tests
to be run in parallel, which is not possible when test assemblies
reference another test assembly that may potentially be running
at the same time.

Update linux CI scripts to run tests by targeting the
solution file. In conjunction with the xunit.runner.json configuration,
this allows test assemblies to run in parallel.

Add coverlet and JunitXml.TestLogger packages to
all test projects using Directory.Build.Props in tests directory. 
This removes the need to add them in CI scripts or to each project individually.
Update packages to newer versions.

- Run tests with Release configuration
- Include only Elastic.Apm.* and exclude all Elastic.Apm test assemblies from code coverage
- Rename Elastic.Apm.DockerTests to Elastic.Apm.Docker.Tests for consistency
- Rename Elastic.Apm.PerfTests to Elastic.Apm.Benchmarks

- Use TestAgentComponents in test

  Update tests to use TestAgentComponents where possible, to
  mitigate intermittent failures with using AgentComponents 
  related to reading environment variables that may be set by other concurrently
  running tests

- Check token cancellation in Workloop

  check token cancellation in the BackendCommComponentBase Workloop, 
  and break if cancellation is requested.

- Don't Use SourceLink in coverlet.settings

  Jenkins cannot display Sourcelinked source code

- Fix hanging tests

  Remove netcoreapp2.2 from tests. netcoreapp2.2 consistently hangs in
  CI on Linux due to MSBuild worker node reuse by dotnet test. This is
  the issue outlined in dotnet/sdk#9452 (comment).

  Setting `nodereuse:false` when running linux dotnet test on netcoreapp2.2 fixes the
  tests hanging, but since netcoreapp2.2 is EOL by Microsoft on December 23, 2019
  (https://dotnet.microsoft.com/platform/support/policy/dotnet-core), netcoreapp2.2
  has been removed from tests in line with the policy that we should only support
  versions that are supported by Microsoft.
@nielswitte

I am using Xunit and was able to fix this issue by adding the following to csproj:

<PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.0.0" />

Thank you.
I was also using xUnit; referencing Microsoft.NET.Test.Sdk (latest, 17.1.0) in every test project solved it for me.

@koepalex

koepalex commented Aug 3, 2022

I have the same issue with WSL2 (Ubuntu) on Windows 11; my tests use xUnit (2.4.2) and .NET 6 (6.0.301).

Updating to <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.2.0" /> and adding [assembly: CollectionBehavior(DisableTestParallelization = true)] didn't solve the issue for me. I also tested versions 17.0.0 and 17.1.0 of Microsoft.NET.Test.Sdk with the same result. :-(

@Kuinox

Kuinox commented Aug 10, 2022

Hi, I think a lot of these issues are due to a bug in the Process type; I opened an issue about it here: dotnet/runtime#51277

In summary:

WaitForExit() synchronously waits for the stderr/stdout streams to complete. The spawned process has exited, but it still has child processes running (the node-reuse processes, in MSBuild's case), and in that situation the stderr/stdout pipes won't close.

It looks like most of these issues happen when the dotnet SDK is invoked from another .NET process, i.e. via the Process class.

@GeorgDangl

It looks like most of these issues happen when the dotnet SDK is invoked from another .NET process, i.e. via the Process class.

This makes a lot of sense, since we're usually running our CI pipelines via NUKE, which is a .NET CLI app that calls the dotnet process.

@Piedone
Member

Piedone commented Nov 16, 2022

@bencef if you still happen to have this, check out microsoft/vstest#2080 (comment). Input redirection might solve your problem.
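
If anyone wants to try that quickly, the idea (as I understand the linked comment) is just to give the process an empty stdin so it never waits on console input, e.g.:

# Redirect stdin from /dev/null for non-interactive environments
# (Emacs shells, containers, CI agents without a real terminal).
dotnet test < /dev/null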

@ana-cozma

Had the same on .NET 5 when running the tests in GitHub Actions for SonarScanner. Here is what I changed in the steps to get it working:

      - name: Setup dotnet
        uses: actions/setup-dotnet@v1
        with:
          dotnet-version: '5.0.x'

      - name: Install dependencies test
        run: dotnet restore MyProject.sln -v q

      - name: Build
        run: dotnet build MyProject.sln --no-restore -c Release -v q

      - name: Test
        run: |
          dotnet test MyProject.sln --no-build -c Release -v q \
          --settings testsettings.runsettings --logger trx \
          /p:CollectCoverage=true /p:CoverletOutputFormat=opencover

Maybe it helps someone in the same situation.

@Piedone
Member

Piedone commented Dec 1, 2022

Can you please explain, @ana-cozma, what exactly you changed? The steps you included seem pretty standard to me, unless you have something special in the runsettings file.

@Evengard

Evengard commented Feb 16, 2023

Seems like passing -v:quiet to dotnet test (not needed for dotnet build, only for dotnet test - at least in my experience) alleviates the issue somehow... No idea how, but well, it works. I guess the solution above was the same - just setting the verbosity to quiet.

Also, from here it seems like running the binary directly inside a container (e.g. as a Dockerfile entrypoint, or in my case directly with kubectl exec inside a running Kubernetes pod), rather than wrapping it in a bash call, triggers the issue, whereas wrapping it in a bash call seems to alleviate it. Maybe bash somehow helps with these stdin/stdout stream deadlocks?
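
For completeness, the difference I mean is roughly this (the pod name and project path are placeholders from my setup):

# Invoking the CLI directly inside the pod - the variant that seemed to trigger the hang:
kubectl exec my-test-pod -- dotnet test /app/tests/MyTests.csproj

# Wrapping the same command in a bash call - the variant that seemed fine:
kubectl exec my-test-pod -- bash -c "dotnet test /app/tests/MyTests.csproj"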

@Evengard

Well, none of the workarounds actually worked for me. Sometimes it seemed like one worked, but on one of the next runs it just hung again. Honestly, I'm at a loss.

@diegosasw

Do any of the hanging tests, by any chance, use TestServer (the Mvc.Testing library) with an IHostedService involved?

Mine do.

@Evengard

Not in my case, no.

@angularsen

@Evengard Try the --blame and --blame-hang flags to help pinpoint the hanging tests.
They create a memory dump on hang timeout, which helped me find out that one of my hosted services was still alive.

https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-test

The --filter flag is optional, but useful for limiting the tests you want to run.

dotnet test "$test_project" \
  --no-build \
  --filter "$filter" \
  --logger trx \
  --logger "console;verbosity=detailed" \
  --verbosity "normal" \
  --results-directory TestResults \
  --blame \
  --blame-hang \
  --blame-hang-timeout 1min \
  --diag TestResults/vstest_diagnostics.log

@Evengard

The fact is that after 15 minutes all tests have completed successfully! It's not the tests that are hanging - I see reports of all the tests executing successfully. It's that, after that, dotnet test just hangs before terminating.

@angularsen

I see, that is probably a different case then. I would still give --blame-hang a try and see if any of the tests time out or take longer than expected; maybe the dump reveals something.

If not, the next thing I would try is a process of elimination to find the smallest subset of tests that consistently provokes this behavior.

@Evengard

Evengard commented Sep 6, 2023

--blame-hang never actually helped. It acted as if nothing ever hung - no output, nothing.

But I think I found a workaround for now: #27106 (comment)

@brutaldev

Still having this issue locally and in Azure Pipelines / VSTS.

In my case it hangs for 20+ minutes without any feedback from the console.

[screenshot]

Starting: Run Tests
==============================================================================
Task         : .NET Core
Description  : Build, test, package, or publish a dotnet application, or run a custom dotnet command
Version      : 2.221.0
Author       : Microsoft Corporation
Help         : https://docs.microsoft.com/azure/devops/pipelines/tasks/build/dotnet-core-cli
==============================================================================
C:\Windows\system32\chcp.com 65001
Active code page: 65001
Info: .NET Core SDK/runtime 2.2 and 3.0 are now End of Life(EOL) and have been removed from all hosted agents. If you're using these SDK/runtimes on hosted agents, kindly upgrade to newer versions which are not EOL, or else use UseDotNet task to install the required version.
C:\hostedtoolcache\windows\dotnet\dotnet.exe test D:\a\1\s\src\MyProject.Tests\MyProject.Tests.csproj --logger trx --results-directory D:\a\_temp --no-build --configuration NuGet /p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /nodeReuse:False /interactive:False
Test run for D:\a\1\s\src\MyProject.Tests\bin\NuGet\net7.0\MyProject.Tests.dll (.NETCoreApp,Version=v7.0)
Microsoft (R) Test Execution Command Line Tool Version 17.7.1 (x64)
Copyright (c) Microsoft Corporation.  All rights reserved.

Starting test execution, please wait...
A total of 1 test files matched the specified pattern.
Results File: D:\a\_temp\VssAdministrator_fv-az842-382_2023-09-25_20_00_54.trx

Passed!  - Failed:     0, Passed:    47, Skipped:     0, Total:    47, Duration: 77 ms - DevEnterprise.Foundation.Common.Tests.dll (net7.0)

It takes the full execution time BEFORE "Starting test execution, please wait..." is logged.
After that, the tests all run in under a second.

[screenshot]

This happens every single time and can take 35+ minutes even when trying to run them locally.

Using all the latest packages in the test project as well:

    <PackageReference Include="coverlet.collector" Version="6.0.0">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
    </PackageReference>
    <PackageReference Include="coverlet.msbuild" Version="6.0.0">
      <PrivateAssets>all</PrivateAssets>
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
    </PackageReference>
    <PackageReference Include="Microsoft.NET.Test.Sdk" Version="17.7.2" />
    <PackageReference Include="NSubstitute" Version="5.0.0" />
    <PackageReference Include="NSubstitute.Analyzers.CSharp" Version="1.0.16" />
    <PackageReference Include="Shouldly" Version="4.2.1" />
    <PackageReference Include="xunit" Version="2.5.1" />
    <PackageReference Include="xunit.runner.visualstudio" Version="2.5.1">
      <IncludeAssets>runtime; build; native; contentfiles; analyzers; buildtransitive</IncludeAssets>
      <PrivateAssets>all</PrivateAssets>
    </PackageReference>

Adding an AssemblyInfo attribute to disable parallel execution, as well as the xUnit JSON config, has no effect. As you can see from the command log, using /nodeReuse:False also has no effect.

When using --verbosity d it hangs after the following output:

 Task "Coverlet.MSbuild.Tasks.InstrumentationTask"
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Microsoft.Extensions.DependencyIn
   jection.Abstractions, Version=6.0.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 (location: C:\Users\we
   rne\.nuget\packages\coverlet.msbuild\6.0.0\build\Microsoft.Extensions.DependencyInjection.Abstractions.dll, MV
   ID: f4882a68-9800-4969-a462-28173cdd3e2e, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Microsoft.Extensions.DependencyIn
   jection, Version=6.0.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 (location: C:\Users\winuser\.nuget\pa
   ckages\coverlet.msbuild\6.0.0\build\Microsoft.Extensions.DependencyInjection.dll, MVID: f956b191-99b1-44ae-854
   8-1107f0e7ece3, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Microsoft.Bcl.AsyncInterfaces, Ve
   rsion=6.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51 (location: C:\Users\winuser\.nuget\packages\cove
   rlet.msbuild\6.0.0\build\Microsoft.Bcl.AsyncInterfaces.dll, MVID: 2771814f-5561-420a-99bc-c546ef09ba9f, AppDom
   ain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): System.Threading.Tasks.Extensions
   , Version=4.2.0.1, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51 (location: C:\Users\winuser\.nuget\packages\
   coverlet.msbuild\6.0.0\build\System.Threading.Tasks.Extensions.dll, MVID: 619062a8-972f-4ae5-bbee-e36ac541d14f
   , AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): System.Linq.Expressions, Version=
   7.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a (location: C:\Program Files\dotnet\shared\Microsoft.
   NETCore.App\7.0.11\System.Linq.Expressions.dll, MVID: 617257a8-0520-4194-abfc-ba8e7b29f2c4, AppDomain: [Defaul
   t])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Anonymously Hosted DynamicMethods
    Assembly, Version=0.0.0.0, Culture=neutral, PublicKeyToken=null (location: , MVID: 256f4137-34dc-4c2b-98a8-28
   080b1ffe33, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Microsoft.Extensions.FileSystemGl
   obbing, Version=2.0.1.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 (location: C:\Users\winuser\.nuget\pac
   kages\coverlet.msbuild\6.0.0\build\Microsoft.Extensions.FileSystemGlobbing.dll, MVID: 9a9bd20b-9f5c-459f-8747-
   9d59f0b447f8, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): System.Reflection.Metadata, Versi
   on=1.4.2.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a (location: C:\Users\winuser\.nuget\packages\coverle
   t.msbuild\6.0.0\build\System.Reflection.Metadata.dll, MVID: 5b50abaa-1e92-4e89-9098-ac96a997a34e, AppDomain: [
   Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): System.Collections.Immutable, Ver
   sion=1.2.2.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a (location: C:\Users\winuser\.nuget\packages\cover
   let.msbuild\6.0.0\build\System.Collections.Immutable.dll, MVID: 1e3eb4f5-6a60-404f-9858-d74cba72b4e8, AppDomai
   n: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Mono.Cecil, Version=0.11.5.0, Cul
   ture=neutral, PublicKeyToken=50cebf1cceb9d05e (location: C:\Users\winuser\.nuget\packages\coverlet.msbuild\6.0.0
   \build\Mono.Cecil.dll, MVID: 6f6cad7a-4114-4167-8411-53769a3850ff, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Microsoft.Extensions.DependencyMo
   del, Version=2.1.0.0, Culture=neutral, PublicKeyToken=adb9793829ddae60 (location: C:\Users\winuser\.nuget\packag
   es\coverlet.msbuild\6.0.0\build\Microsoft.Extensions.DependencyModel.dll, MVID: f7a552b5-a108-422d-a636-5ccd9f
   d18b0a, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): netstandard, Version=2.1.0.0, Cul
   ture=neutral, PublicKeyToken=cc7b13ffcd2ddd51 (location: C:\Program Files\dotnet\shared\Microsoft.NETCore.App\
   7.0.11\netstandard.dll, MVID: 40599bf0-8c68-413b-8f28-ac82e8e71774, AppDomain: [Default])
   Assembly loaded during TaskRun (Coverlet.MSbuild.Tasks.InstrumentationTask): Mono.Cecil.Rocks, Version=0.11.5.
   0, Culture=neutral, PublicKeyToken=50cebf1cceb9d05e (location: C:\Users\winuser\.nuget\packages\coverlet.msbuild
   \6.0.0\build\Mono.Cecil.Rocks.dll, MVID: 22ba7d2e-d3af-4f65-8069-a088d8ae6e74, AppDomain: [Default])

Anything else I can provide or test?

@animaonline

Any updates?

@brutaldev

brutaldev commented Oct 27, 2023

I abandoned Coverlet (I also tried a few others, and they all take forever to instrument) and settled on just using Microsoft.CodeCoverage, which is slightly more complex to configure but runs perfectly. I made small modifications to Microsoft's sample runsettings file to include just my assembly pattern, like this:

...
<ModulePaths>
  <Include>
    <ModulePath>.*\\DevEnterprise\.Stuff\..*</ModulePath>  <!-- Add patterns to include your code and test projects here -->
  </Include>
  <Exclude>
    <ModulePath>.*CPPUnitTestFramework.*</ModulePath>
  </Exclude>
</ModulePaths>
...

And I also defaulted the format to Cobertura so the rest of the pipeline tools still work as they did with Coverlet.

...
<DataCollectionRunSettings>
  <DataCollectors>
    <DataCollector friendlyName="Code Coverage" uri="datacollector://Microsoft/CodeCoverage/2.0" assemblyQualifiedName="Microsoft.VisualStudio.Coverage.DynamicCoverageDataCollector, Microsoft.VisualStudio.TraceCollector, Version=11.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a">
      <Configuration>
        <Format>Cobertura</Format>  <!-- Add this to output as Cobertura -->
        <CodeCoverage>
...

Finally, add the runsettings file to your test csproj file like this:

...
<PropertyGroup>
  <RunSettingsFilePath>$(MSBuildThisFileDirectory)tests.runsettings</RunSettingsFilePath>
</PropertyGroup>
...
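
With RunSettingsFilePath wired up in the project file, the test step itself can stay a plain invocation; as far as I can tell the settings are picked up automatically, so something like this (paths are placeholders) is enough:

# No --settings argument needed: the csproj's RunSettingsFilePath points at
# tests.runsettings, which enables the Code Coverage data collector and emits
# Cobertura output for the rest of the pipeline.
dotnet test src/MyProject.Tests/MyProject.Tests.csproj --no-build --configuration NuGet --logger trx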

[screenshot]

The results speak for themselves.

@uzair08inator

@Evengard Try the --blame and --blame-hang flags to help pinpoint the hanging tests. It creates a memory dump on hang timeout that helped me find out that it was one of my hosted services that was still alive.

https://learn.microsoft.com/en-us/dotnet/core/tools/dotnet-test

Filter flag is optional, but useful to limit the tests you want to run.

dotnet test "$test_project" \
  --no-build \
  --filter "$filter" \
  --logger trx \
  --logger "console;verbosity=detailed" \
  --verbosity "normal" \
  --results-directory TestResults \
  --blame \
  --blame-hang \
  --blame-hang-timeout 1min \
  --diag TestResults/vstest_diagnostics.log

This helped me. A SQL provider mock was missing in my test... it got stuck forever.

@Evengard

Just a heads up: the problem went away for me completely when I migrated to .NET 8.

@Piedone
Member

Piedone commented Jul 17, 2024

A glimmer of hope for those still stuck on this:
