Magento Community Fork Composer architecture

In his article about Magento fork challenges, Alan Storm mentioned:

Magento 2 is intended to be installed and run via Composer, but Magento doesn’t publish packages to the public Packagist repository. Instead, they have a third-party repository where individual packages are published. This repository is not public — it requires a Composer username and password that’s tied to your Magento Inc. account.

Rumor has it that Private Packagist powers this repository, but the code that publishes the individual packages from Magento’s main project repository is not, to my knowledge, shared publicly or open source. I’ve long thought it might be interesting to build a tool that looks at the Magento code in their GitHub repository and confirms it’s the same code that’s been published to the private composer repository.

From a technical point of view, forking Magento 2 probably means recreating this infrastructure. Perhaps, on a long enough timeline, it could even mean simplifying it. Are Magento's best practices actually best practices?

Here are my thoughts about Composer in the Magento community fork.

There are two solutions:

First: don't use multiple Composer packages, at least from the start. We can install Magento from a single Composer repository and disable modules in the app config.
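As a sketch of what that looks like: Magento already tracks module state in the `app/etc/config.php` module map (the `Magento_SomeModule` name below is a placeholder, not a real module):

```php
<?php
// app/etc/config.php (fragment) -- 1 enables a module, 0 disables it.
// "Magento_SomeModule" is an illustrative placeholder name.
return [
    'modules' => [
        'Magento_Catalog' => 1,
        'Magento_SomeModule' => 0
    ]
];
```

In a standard Magento installation, running `bin/magento module:disable Magento_SomeModule` updates this file for you.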

Or we can install Magento into app/code/Magento.

I don't think what Magento has now is the best way to do things.

Let's check how a modern eCommerce solution made in Germany works:

Yes, it will require a little bit of migration effort.

git clone new/repo magento3-new
cp -r magento2-old/app/* magento3-new/app/

All migration/upgrade processes are really easy to automate and add to the repo.
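A minimal sketch of such an automated migration step, assuming the directory layout from the commands above (the function and directory names are illustrative, not an official tool):

```shell
# migrate_app: copy the app/ directory from an old checkout into a new one.
migrate_app() {
  old_dir=$1
  new_dir=$2
  mkdir -p "$new_dir/app"
  # the trailing "/." copies the directory contents, preserving subdirectories
  cp -r "$old_dir/app/." "$new_dir/app/"
}

# usage:
# migrate_app magento2-old magento3-new
```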

Or a Magento mono-Composer project:

composer create-project magento3/composer-project magento3 --no-interaction --stability=dev

with an installation script in composer.json:
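Such a hook could be sketched in composer.json's scripts section; `post-create-project-cmd` is a standard Composer event, but the command below is a placeholder for the actual download/unpack step:

```json
{
    "scripts": {
        "post-create-project-cmd": [
            "echo 'run the asset download / unpack step here'"
        ]
    }
}
```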

curl -L "<archive-url>" > <archive.zip> && unzip -n <archive.zip> && rm <archive.zip>

Simple is better. That's part of why Magento 2 became such a mess: it was really over-engineered, and it will only get worse when AEM, Sensei, and Adobe microservices are added.

As a fallback, the Magento Composer approach can be reimplemented.

Second solution:

Implement a release process that grabs a single module from the fork repository, creates a ZIP file from it, puts that ZIP together with its composer.json metadata into S3, and then adds the release to a custom Satis repository. This requires adding an additional S3-backed repository to the project's Composer configuration.

Composer's artifact repository, and multiple package versions in a single package repository, can also be used.
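Wiring the project to such repositories could look like the composer.json fragment below; both repository types are standard Composer features, but the bucket URL and artifact path are assumptions, not real endpoints:

```json
{
    "repositories": [
        {
            "type": "composer",
            "url": "https://magento-fork-packages.s3.amazonaws.com"
        },
        {
            "type": "artifact",
            "url": "path/to/zip-artifacts/"
        }
    ]
}
```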

This approach is described here:

This docker image builds a composer repository using satis, and uploads it to an AWS S3 bucket.
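The Satis side of that setup is driven by a satis.json; a minimal sketch with placeholder names and URLs:

```json
{
    "name": "magento-fork/package-repository",
    "homepage": "https://magento-fork-packages.s3.amazonaws.com",
    "repositories": [
        { "type": "vcs", "url": "https://github.com/magento-fork/module-1" }
    ],
    "require-all": true
}
```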

We can use S3 as a Composer repository:

Example of the Magento/Repo composer.json:

"name": "magento/framework",
"version": "102.0.4-p2",
"dist": {
"type": "zip","url": "",
"shasum": "d497cb6a1d2db953e97ee249198b28e9579b9601"

And here is the monorepo GitHub code:

There is a description of the solution when using Private Packagist:

Multi packages were first added to Private Packagist with Security Monitoring in July, 2020. You need to have a composer.json file in each directory which should be considered a package of its own.

However, this requires maintaining the private packages. To reuse existing Packagist tooling, each package needs its own repository per module. That is not a problem: we can easily split the Magento fork monorepo into single repos during the build/release phase and use them as the Packagist source.

Release script draft:

# Create a technical repo on GitHub and register it as a remote (the URL is a placeholder)
gh repo create technicalrepo
git remote add technicalrepo https://github.com/<org>/technicalrepo.git

# Prepare a branch that contains only the module's files
git checkout -b branch-for-module-1
cp -r app/code/Magento/module-1 /tmp/
git rm -rf .
cp -r /tmp/module-1/* .
git add -A
git commit --allow-empty -m "root commit"

# Push the module branch and tag a release
git push technicalrepo branch-for-module-1:master
gh release create $VERSION
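An alternative sketch for the same splitting step is git's built-in subtree split, which also preserves the module's commit history; the prefix, branch, and remote names below are illustrative:

```shell
# split_module: extract the history of one module directory into its own branch.
# Requires the git-subtree command (shipped with standard git installations).
split_module() {
  prefix=$1   # e.g. app/code/Magento/module-1
  branch=$2   # e.g. module-1-only
  git subtree split --prefix="$prefix" -b "$branch"
}

# usage (from the monorepo root):
# split_module app/code/Magento/module-1 module-1-only
# git push technicalrepo module-1-only:master
```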

gh is GitHub on the command line. It brings pull requests, issues, and other GitHub concepts to the terminal next to where you are already working with git and your code.

Add the module to Packagist using the technical repo.

Anton Kril's idea

It’s still not clear if there is a plan/desire to create a fork. But #Magento has a well-tested built-in forking mechanism that’s better than forking: modules. Composer in M2 adds modularity to its framework. A set of community modules/packages would be way easier to maintain.

To maintain such a set of components you would still need a ci/cd pipeline, contribution process, guidelines, documentation, testing infrastructure, release process, support, etc. But merging, testing, troubleshooting, and support would be much easier.

Why couldn’t a well-maintained and battle-tested set of modules that extend Magento and replace some core modules be such an alternative? You can always fork if really needed. Git saves history :) But I don’t think it will ever be needed.

There is a reason why composer exists. There is a reason why extension developers use composer packages and magento modules instead of git branches to distribute modules. There is a reason agencies consider editing core code evil.

Igor Minyaylo

We packed all MSI (in our case, fork) components into a separate Composer meta-package and referenced all of them using “^”…

This allows non-breaking updates to arrive with composer update.
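A meta-package along those lines could be sketched like this; `metapackage` is a standard Composer package type, but the package names and versions here are invented for illustration:

```json
{
    "name": "magento-fork/inventory-metapackage",
    "type": "metapackage",
    "require": {
        "magento-fork/module-inventory": "^1.0",
        "magento-fork/module-inventory-api": "^1.0",
        "magento-fork/module-inventory-sales": "^1.0"
    }
}
```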

Also, we can use Composer replace/patches instead of shipping full module files if we need to fix just one or a few files.
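For example, this could combine Composer's `replace` field with the community cweagans/composer-patches plugin; the replaced package name and patch file below are placeholders:

```json
{
    "require": {
        "cweagans/composer-patches": "^1.7"
    },
    "replace": {
        "magento/module-some-module": "*"
    },
    "extra": {
        "patches": {
            "magento/framework": {
                "Fix a single broken file": "patches/framework-single-file-fix.patch"
            }
        }
    }
}
```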

Magento/APP Cloud Architect. Melting metal server infrastructure into cloud solutions.