Migrate issues and wiki from GitLab to GitHub
```sh
git clone https://github.com/piceaTech/node-gitlab-2-github.git
```
Before using this script, you must mirror your GitLab repo to your new GitHub repo. This can be done with the following steps:
Clone the repo from GitLab using the `--mirror` option. This is like `--bare`, but also copies all refs as-is, which is useful for a full backup/move:

```sh
git clone --mirror git@gitlab.com:username/repo.git
```
Change into the newly created repo directory.
Push to GitHub using the `--no-verify` option, which skips any hooks:

```sh
git push --no-verify --mirror git@github.com:username/repo.git
```
Set the push URL to the mirror location:

```sh
git remote set-url --push origin git@github.com:username/repo.git
```
To periodically update the repo on GitHub with what you have in GitLab:

```sh
git fetch -p origin
git push --no-verify --mirror
```
After doing this, the autolinking of issues, commits, and branches will work. See Usage for next steps.
The user must be a member of the project you want to copy or else you won't see it in the first step.
Copy the sample settings, fill in the values described below, then run the script:

```sh
cp sample_settings.ts settings.ts
npm run start
```
The URL under which your GitLab instance is hosted. Defaults to the official gitlab.com.
In GitLab, go to Settings / Access Tokens. Create a new Access Token with the `api` and `read_repository` scopes and copy it into the `token` field of your `settings.ts`.
Leave it null for the first run of the script; the script will then show you which projects exist. Can be either a string or a number.
Where is the GitHub instance hosted? Defaults to the official github.com.
Under which organisation or user will the new project be hosted?
In GitHub, go to Settings / Developer settings / Personal access tokens. Generate a new token with the `repo` scope and copy it into the `token` field of your `settings.ts`.
What is the name of the new repo?
S3 can be used to store attachments from issues. If omitted, a `has attachment` label will be added to the GitHub issue instead.
AWS credentials used to copy attachments from GitLab into the S3 bucket. The IAM user who owns these credentials must have write permission to the bucket.
An existing bucket with an appropriate security policy. One possible policy is to allow public access.
Defaults to false. When enabled, the script doesn't fire any requests to the GitHub API and only does the work on the GitLab side, to test for wonky cases before using up API calls.
If this is set to true (default), the migration process will automatically create empty dummy issues for every 'missing' GitLab issue (for example, if you deleted a GitLab issue). Those issues will be closed on GitHub; they ensure that the issue ids stay the same on both GitLab and GitHub.
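The dummy-issue bookkeeping boils down to finding the gaps in the GitLab issue numbering. A minimal sketch of that step (a hypothetical helper for illustration, not the script's actual code):

```typescript
// Given the issue iids that still exist on GitLab, find the deleted ones
// that would need a closed placeholder issue on GitHub so that numbering
// stays aligned between both systems.
function missingIids(existing: number[]): number[] {
  const have = new Set(existing);
  const max = Math.max(...existing);
  const missing: number[] = [];
  for (let iid = 1; iid <= max; iid++) {
    if (!have.has(iid)) missing.push(iid);
  }
  return missing;
}

missingIids([1, 2, 4, 7]); // → [3, 5, 6]
```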
If this is set to true (default), the migration process will automatically create so-called "replacement issues" for every issue whose migration fails. A replacement issue will be exactly the same, but the original description will be lost. In the future, the description of the replacement issue will also contain a link to the original issue on GitLab, so that users who still have access to the GitLab repository can still view its content. However, this is still an open task. (TODO)
It would of course be better to find the cause of the migration failures, so that no replacement issues would be needed. Finding the cause, combined with a retry mechanism, would be optimal and may come in the future; currently the replacement-issue mechanism helps to keep things in order.
If this is set to true (default is false) then all merge requests will be migrated as GitHub issues (rather than pull requests). This can be used to sidestep the problem where pull requests are rejected by GitHub if the feature branch no longer exists or has been merged.
This is an array (empty by default) that may contain string values. Any note/comment in any issue that contains one or more of those string values will be skipped (meaning not migrated). Note that matching is case insensitive, so the string value `foo` would also lead to skipping notes containing the (sub)string `Foo`.
Useful values include `time spent` (such terms are used in GitLab to track time but are rather meaningless on GitHub), `changed the description`, `added 1 commit`, `mentioned in merge request`, etc., as they are interpreted as comments.
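The case-insensitive substring matching described above can be sketched like this (a hypothetical helper; the script's actual implementation may differ):

```typescript
// Example skip list, as discussed above.
const skipMatchingComments = ['time spent', 'changed the description'];

// A note is skipped if its body contains any of the configured terms,
// compared case-insensitively.
function shouldSkipNote(body: string, terms: string[]): boolean {
  const lower = body.toLowerCase();
  return terms.some(term => lower.includes(term.toLowerCase()));
}

shouldSkipNote('Time Spent: 2h', skipMatchingComments);     // true (case insensitive)
shouldSkipNote('looks good to me', skipMatchingComments);   // false
```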
Object consisting of a `log` flag and a log file. If `log` is set to true, the merge requests are logged to the specified file and not migrated. Conversely, if `log` is set to false, the merge requests are migrated to GitHub and not logged. If the source or target branch linked to a merge request has been deleted, the merge request cannot be migrated to a pull request; instead, an issue with a custom "gitlab merge request" tag is created with the full comment history of the merge request.
Maps usernames from GitLab to GitHub. If the assignee of a GitLab issue is the same as the user currently logged in on GitHub, it will also get assigned without a usermap. Mentions in issues will also be translated to the new GitHub name.
For when one renames the project while transferring, so that the projects don't lose their links to the mentioned issues.
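Putting the settings described above together, a `settings.ts` might look roughly like the following. This is only a sketch: the field names and shapes here are assumptions based on the descriptions in this document; check `sample_settings.ts` in the repo for the authoritative structure.

```typescript
// Hypothetical settings sketch; field names are guesses based on the
// descriptions above, not the authoritative sample_settings.ts.
const settings = {
  gitlab: {
    url: 'https://gitlab.mycompany.com', // your GitLab instance
    token: 'my-gitlab-access-token',     // Access Token (read_repository scope)
    projectId: null,                     // string or number; null on the first run, so the script lists projects
  },
  github: {
    owner: 'my-github-org',              // organisation or user hosting the new project
    token: 'my-github-access-token',     // Personal Access Token (repo scope)
    repo: 'my-new-repo',
  },
  usermap: { 'gitlab.user': 'github-user' },
  projectmap: { 'gitlab-group/old-name': 'my-github-org/new-name' },
  mergeRequests: { log: false, logFile: './merge-requests.json' },
  debug: false, // true: skip GitHub API calls, only exercise the GitLab side
};

// In the real settings.ts this would be `export default settings;`
```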
Because GitHub has a limit of 5000 API requests per hour, one has to watch out not to exceed it. I transferred one of my projects with ~300 issues and ~200 notes. This totals some 500 objects, excluding commits, which are imported through GitHub's importer. I never got below 3800 remaining requests (while testing it twice in one hour). So the rule of thumb is that one can import a repo with ~2500 issues without a problem.
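The rule of thumb is simple arithmetic over the rate limit. The per-object request cost below is an assumption chosen to err on the safe side (the author's own test consumed noticeably fewer calls per object):

```typescript
// Back-of-envelope budget check for one hour of migration.
const limitPerHour = 5000;   // GitHub API rate limit per hour
const callsPerObject = 2;    // rough, pessimistic assumption per issue/note
const maxObjectsPerHour = limitPerHour / callsPerObject;
// maxObjectsPerHour === 2500, matching the ~2500-issue rule of thumb
```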
See the section 'useReplacementIssuesForCreationFails' above for more info! One reason seems to be an error with `Octokit` (error message snippet: https://pastebin.com/3VNUNYLh).
The milestone refs and issue refs do not seem to be rewritten properly at the moment. Specifically, milestones show up like `%4` in comments, and issue refs like `#42` do not remap from the GitLab `#42` to the new issue number on GitHub. `@` references are remapped properly (yay). If this is a deal breaker: a large amount of the code to do this has been written, it just appears to no longer work in its current form :(
A throttling mechanism could help to avoid API rate limit errors. In some scenarios the ability to migrate at all is probably more important than the total duration of the migration process. Some users may even be willing to accept a very long duration (more than a day if necessary) in return for getting the migration done at all.
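Such a throttling mechanism could be as simple as enforcing a minimum gap between requests, sized from the hourly budget (a sketch under that assumption, not an implementation proposal for the script itself):

```typescript
// Returns a wrapper that spaces out async calls so at most `maxPerHour`
// of them start per hour, by enforcing a minimum gap between starts.
function makeThrottle(maxPerHour: number) {
  const minGapMs = 3_600_000 / maxPerHour;
  let next = 0; // earliest timestamp the next call may start
  return async function throttled<T>(fn: () => Promise<T>): Promise<T> {
    const wait = Math.max(0, next - Date.now());
    next = Date.now() + wait + minGapMs;
    if (wait > 0) await new Promise(resolve => setTimeout(resolve, wait));
    return fn();
  };
}

// Usage: wrap each API call, e.g. a budget of 4500/hour leaves headroom.
const throttled = makeThrottle(4500);
// await throttled(() => someApiCall());
```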
Some requests could be run in parallel to shorten the total duration. Currently all GitLab and GitHub API requests are run sequentially.