Any update about this issue??
Any news on this?
Any update on this? Really need this feature
Any news on this? I just spent half a day trying to build an automated workflow for this, only to find out that you need to authenticate!
It’s really bad that this isn’t transparently documented and that it works this way!
Any news on this, please?
This is crazy. Welcome to Microsoft!
I’d also like to ask for a status on this. Right now, pulling public packages requires creating an access token that has full control of all repos you have access to (I tried limiting the scope to public_repo, but alas, that’s not enough). Worse, if the public package is in an org with SSO, you need to enable SSO access for the token as well, allowing the token access to every private repo in that org too. Access tokens bypass 2FA, so this is a pretty awful security issue that makes public package repositories unusable outside of GitHub Actions.
Also, another request here is to make it possible to pull an image with only a read-level repo scope on the token, rather than needing full access.
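To illustrate what the current behavior forces you to do, here is a minimal sketch of pulling from the docker.pkg.github.com registry. The OWNER/REPO/IMAGE names are placeholders, and GITHUB_PAT is a hypothetical environment variable holding a personal access token; this is just how the login-then-pull dance looks today, not an endorsement of it:

```shell
# Placeholders -- substitute your own org/repo/image names
OWNER=octocat
REPO=hello-world
IMAGE=app
TAG=latest
IMAGE_PATH="docker.pkg.github.com/${OWNER}/${REPO}/${IMAGE}:${TAG}"
echo "$IMAGE_PATH"

# Even though the package is public, the pull fails with
# "no basic auth credentials" unless you log in with a PAT first:
#   echo "$GITHUB_PAT" | docker login docker.pkg.github.com -u "$OWNER" --password-stdin
#   docker pull "$IMAGE_PATH"
```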
Thank you @ryan-lane for your feedback! While I do not have a timeline to share for this, I’ve added your insight as well to the open internal feature request.
Any updates ???
I am not sure if I should be glad to know I am not alone…
I was really hoping to use this feature as a way of adding an additional layer of convenience for trying out my software, but to my dismay, when I tried to use it myself I was unable to pull the image due to a lack of authentication. I was planning on using it for testing since I need a linux environment and it does not build on Mac OS X, but was halted in my tracks with the message “no basic auth credentials”.
My disappointment is immeasurable, and my day has been ruined
This absolutely needs to be fixed. In the meantime, I will use plain Docker Hub.
Please change this
This definitely needs a change. I hoped using GitHub instead of Docker Hub would give me better integration. Now I see that I cannot even download the created images later. That’s multiple hours of testing to get everything up and running, only to find it non-functional in the end, because the packages are publicly available only through the GUI, not through the Docker CLI. This is not good.
Adding the following note to my docker-compose files is an acceptable workaround (for now):
```yaml
# NOTE: Since I host sources on github I would prefer to use github packages
# to host docker images, but as of March 2, 2020 github still requires auth
# to pull from public repos. Screw that. For now, I mirror builds to dockerhub.
#
# https://github.community/t5/GitHub-Actions/docker-pull-from-public-GitHub-Package-Registry-fail-with-quot/m-p/34983#M2067
```
The URL refers to this thread. Then I usually have an image section like this, with the github package commented out, waiting for the day, month, year, or maybe next century when they fix this:
```yaml
services:
  frontend:
    #image: docker.pkg.github.com/xxxx/frontend:latest  # see NOTE
    image: docker.io/xxxx/frontend:latest
  backend:
    #image: docker.pkg.github.com/xxxx/backend:latest  # see NOTE
    image: docker.io/xxxx/backend:latest
```
In my github action I simply add one more job to tag and push to dockerhub. Come to think of it, I kind of like having the extra repo at hand. On Docker Hub I keep the latest *and* can tag my old releases, while on GitHub Packages and AWS I only keep the latest.
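For anyone who wants the same mirroring setup, a rough sketch of what that extra job could look like follows. The job name, the `build` dependency, the image name, and the DOCKERHUB_USERNAME / DOCKERHUB_TOKEN secret names are all assumptions you would adapt to your own workflow, not a definitive recipe:

```yaml
# Sketch of a mirror-to-Docker-Hub job in a GitHub Actions workflow.
# Secret names are hypothetical; configure your own in repo settings.
mirror-to-dockerhub:
  runs-on: ubuntu-latest
  needs: build
  steps:
    - uses: actions/checkout@v2
    - name: Build image
      run: docker build -t docker.io/xxxx/frontend:latest .
    - name: Log in to Docker Hub
      run: echo "${{ secrets.DOCKERHUB_TOKEN }}" | docker login -u "${{ secrets.DOCKERHUB_USERNAME }}" --password-stdin
    - name: Push mirrored image
      run: docker push docker.io/xxxx/frontend:latest
```

Using a token via `--password-stdin` keeps the credential out of the shell history and process list, which matters given the token-scope concerns raised earlier in this thread.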
This is a crazy restriction; I can’t see this feature gaining any sort of adoption in its current state.
This also needs to be clearly documented. I wasted time thinking I was doing something wrong ("It can’t require authentication for *PUBLIC* packages, the docs don’t say anything about that") until I found this thread.
Any news on this matter, considering that npm was just bought by GitHub (Microsoft)? Auth for public packages is an absolute no-go and must be resolved ASAP.