@JakubMisek

I think the cache could be much smaller if it only contained the headers and docblocks, correct?

  1. My project has a vendor folder because of composer
  2. 60k PHP files
  3. 250MB in total

I just timed it: it took about 2 minutes, from what I can see of the CPU activity on my SSD. I could probably get better info if the language server could output some stats.

I sometimes use composer's files to derive information, e.g. folder paths etc. It might help to check package versions before deciding if a scan should take place.
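For example, a re-scan could be skipped when composer.lock hasn't changed since the last run. A minimal sketch of what I mean, assuming a standard composer.lock layout (the fingerprint/cache-key idea here is hypothetical, not something PHP Tools necessarily does):

```python
import hashlib
import json
from pathlib import Path

def composer_fingerprint(project_root: str) -> str:
    """Hash the installed package versions listed in composer.lock.

    If this fingerprint matches the one stored alongside a previous
    scan, the vendor folder is unchanged and the scan can be skipped.
    """
    lock = json.loads(Path(project_root, "composer.lock").read_text())
    # composer.lock lists packages under "packages" and "packages-dev"
    pairs = sorted(
        (pkg["name"], pkg["version"])
        for section in ("packages", "packages-dev")
        for pkg in lock.get(section, [])
    )
    return hashlib.sha256(json.dumps(pairs).encode()).hexdigest()
```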

    phptools all right, that's definitely more than it should be.

    We could squish it into headers+phpdoc (if they're type annotated). Anyway, the scan should not take 2 minutes :/

    It's Ubuntu/x64, right?

      JakubMisek

      yes, Ubuntu 22.04.1 LTS x64. How many files, and how much storage, does your 2-second workspace use?

        phptools we're testing equivalent to two Laravel installations (13K files)

        Do you see the following in the VSCode status bar (bottom left)

        {spinning icon} Processing N ....

        or just the {spinning icon}

        ?

          JakubMisek yes, there is the Processing N, and N is a number that keeps changing until parsing is complete. It takes about 2 minutes, sometimes longer; 2 minutes is the fastest I've seen.

          I think 13k files is small, as no other packages are used. You might have to survey around for this, but to me, my 60k-file project is considered "small" too.

            phptools we should definitely support 100k and more. Let me do some tests.

            Does N increase over time and then decrease, or does it mostly decrease linearly?

            If you'd help us, maybe make a video of that status bar plus some resource manager showing CPU and SSD utilization ... it's possible we just do it inefficiently on some systems.

            JakubMisek

            https://youtu.be/91Xql2YRZc4

            N jumps around. This time it took 10 minutes.

            OK, to be clear: the 2 minutes (A) and 10 minutes (B) were measured on different setups.

            A - 8 + 8 core 5800X, VSCode in Windows 10, local SSD E:
            B - 8 core 5800X, Ubuntu in VirtualBox running on Windows 10 host, parsing the exact same files, but via VirtualBox shared folder E:

            I have measured faster times on B before. SSD utilization is always under 10%.

            JakubMisek I want to add that it is a multi-root workspace, about 15 roots in total. Not sure if that is a cause of the problem. Thank you.

              phptools nice specs;

              this is definitely a synchronization issue; I've tested the same amount of files on a 4-core laptop with 16G RAM, 1.4GHz, SSD ... and it's a few seconds here.

              I'll watch it again a few times, if I see some pattern ...

              Wanted to provide this info:

              I tested a few projects from GitHub, they were processed in under 10 seconds.
              Then I modified my project to be single-root, but it did not reduce the time; it still took about 1-2 minutes.

                I found 2 folders inside vendor/composer that were left over from broken downloads; somehow composer did not clear them out. Removing those files brought the number of PHP files down to 37k, but it still takes over a minute to parse at its fastest.

                phptools thanks, it's still way more than it should be.

                Actually, I think it's slower because your PC is faster; doing more things at once is probably causing some synchronization issues.

                  phptools one more question please

                  in Help/Toggle Developer Tools -> Console

                  Do you see something like

                  phptools: analysis pending {N} ...

                  That shouldn't be there while parsing is in progress;

                    phptools

                    We've made some "tweaks" and prepared pre-release 1.27.11988

                    I've made a quick video with just 5 workspace folders and 16k files (but I'll prepare a larger test soon); here is the whole loading: https://youtu.be/rgUEHjaDijI (it takes about 10 seconds on HDD to fully load)

                    So currently, I guess for 60k files, 30-40 seconds would be expected (and we'll be working on improving that).

                    Minutes would mean a bug.


                    Note: I'd recommend explicitly excluding the "vendor" folder from being analyzed, using the "php.problems.exclude" setting (https://docs.devsense.com/en/vscode/problems#phpproblemsexclude). Something like the following:

                    "php.problems.exclude": {
                        "vendor": true
                    }

                      JakubMisek

                      Thank you!

                      I already have vendor in php.problems.exclude, and I timed 1.27.11988 quite consistently at about 90 seconds (5-6 runs) for the same project using setup B.

                      By the way, B is also running using code-server.

                      Once again, thanks!

                        @JakubMisek

                        Something else related that I noticed: starting the PHP Tools XDebug while the language server (LS) is still parsing seems to have a problem.

                        I can start PHP Tools XDebug, but I think the listener doesn't start correctly, because my server can't connect back.

                        But if I start the debug after LS has finished, then things work correctly.
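One quick way to check whether the debug listener actually came up is to probe the port from outside while the LS is still parsing. A small sketch (9003 is Xdebug 3's default port; the host and port here are assumptions, adjust them to whatever your launch configuration uses):

```python
import socket

def xdebug_listener_up(host: str = "127.0.0.1", port: int = 9003) -> bool:
    """Return True if something is accepting TCP connections on host:port.

    9003 is Xdebug 3's default client port; change it to match the
    "port" value in your VSCode launch configuration.
    """
    try:
        # A successful TCP connect means a listener (e.g. the debug
        # adapter) is actually accepting connections on that port.
        with socket.create_connection((host, port), timeout=1.0):
            return True
    except OSError:
        return False
```

If this returns False while the debug session claims to be running, the listener never started and the server's connect-back will fail.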

                        JakubMisek

                        Yes, the usual listening on port 0.0.0.0:xxxx... but nothing that indicates an error. Let me monitor further.