If you have multiple projects or organizations and you want to clone a machine that is already configured, you can create an image of that machine's instance. You will then have to share access with the external project or organization. Here is how to do it:
Share the image with the organization in Images and in IAM
In both IAM and Images (VM) you will have to grant the Compute Image User role
Start the VM using the shared image from another project / organization
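As a sketch, the equivalent gcloud commands look roughly like this (project, image, disk, zone, and account names are all placeholders):

# In the source project: create an image from the configured instance's disk
gcloud compute images create my-image \
    --source-disk=my-disk --source-disk-zone=europe-west1-b \
    --project=source-project
# Grant the Compute Image User role to an account from the other project / organization
gcloud projects add-iam-policy-binding source-project \
    --member=user:someone@example.com \
    --role=roles/compute.imageUser
# In the destination project: start a VM from the shared image
gcloud compute instances create my-clone \
    --image=my-image --image-project=source-project \
    --zone=europe-west1-b --project=dest-project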
It is possible that when you try to import a backup of your MySQL database on Google Cloud, you get the following error:
We are trying to import a backup into Google Cloud SQL.
The following error appears in the logs:
error: exit status 1 stdout(capped at 100k bytes): stderr: ERROR 3546 (HY000) at line 26: @@GLOBAL.GTID_PURGED cannot be changed: the added gtid set must not overlap with @@GLOBAL.GTID_EXECUTED
To solve the problem, open the SQL file with a text editor, search for “@@GLOBAL.GTID_PURGED”, and remove (or comment out) the SET statement that assigns it.
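As a sketch, assuming your dump is named backup.sql, you can strip that statement from the command line (if the statement spans several lines in your dump, remove it by hand instead):

# Delete the line containing the SET @@GLOBAL.GTID_PURGED statement (file name is hypothetical)
sed -i '/GTID_PURGED/d' backup.sql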
After modifying the *.sql file, you should be able to import it successfully:
It will renew all the certificates on the 1st of each month (0 0 1 * *), stopping nginx before the renewal and starting it again afterwards so the renewal can work. It works on all Linux servers.
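A minimal crontab sketch of such an entry, assuming certbot and a systemd-managed nginx (adjust the service commands to your distribution):

# Renew all certificates on the 1st of each month, stopping nginx around the renewal
0 0 1 * * systemctl stop nginx && certbot renew && systemctl start nginx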
If you have a database on Google Cloud and you want to export some tables and import them into another database, you will need a database explorer to produce the .SQL backup.
Export the tables you want from the database
The Google Cloud interface only allows exporting a whole instance or a specific database, but not individual tables
As you can see, you don’t have the option to select tables. To do so, you will have to connect to the database with a SQL explorer. In MySQL Workbench, go to Server > Data Export
Then you should be able to select the tables you want
Select Export to Self-Contained File to get a .SQL file
Import the SQL file to Google Cloud
You will need to upload the SQL file to Google Cloud in order to import it. Go to Google Cloud > Cloud Storage and create a bucket if one doesn’t exist yet:
Create a bucket dedicated to SQL files, with random numbers in the name to make it unique
Upload the SQL file generated by MySQL Workbench
Then, go to Google Cloud > SQL > Import
Select the SQL file from the bucket and select the database into which you want the tables imported
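The same flow can also be done from the command line; a sketch, with bucket, instance, and database names as placeholders (the Cloud SQL service account must be able to read the bucket):

# Create a uniquely named bucket dedicated to SQL files
gsutil mb gs://sql-imports-482915
# Upload the dump generated by MySQL Workbench
gsutil cp export.sql gs://sql-imports-482915/
# Import it into the target database of the Cloud SQL instance
gcloud sql import sql my-instance gs://sql-imports-482915/export.sql --database=my-database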
The tables should then be imported successfully into the selected database. You can check the result in Operations
In case of errors, check the message in Operations, open the .SQL file with a text editor, search for the offending statement, remove those lines, and repeat the process (upload the file to Google Cloud Storage again and re-import it)
We often need images in our React or Angular app, but putting them in the project is not clean, so we can use Google Cloud Storage to host the images for your app.
There are many advantages to using Cloud Storage to expose public images:
– Real-time maintenance (upload, delete, change, …)
– No commits needed in your Git history (if you commit images to your GitHub / GitLab, for example)
– A clean project containing only code files
– A lighter repository, since images quickly get heavy
Create a Bucket
First, you need to create a Bucket (it’s like a big folder) if you don’t already have one.
Switch to Uniform access control
Then, to make your Bucket public, go to the Permissions tab, and change the access control to Uniform:
You should click Switch to Uniform so you don’t have to make each file public one by one
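The same switch can be done from the command line; a sketch with a placeholder bucket name:

# Enable uniform bucket-level access so permissions apply to the whole bucket
gsutil uniformbucketlevelaccess set on gs://my-images-bucket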
You can see, if you upload your first image, that it’s not public yet
Add public access by adding a Reader role for allUsers
You can then grant the Storage Object Viewer access to allUsers
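From the command line, a sketch of the same grant (bucket name is a placeholder):

# Grant the Storage Object Viewer role to allUsers, making every object publicly readable
gsutil iam ch allUsers:objectViewer gs://my-images-bucket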
Then you can see it’s Public to internet and you can copy the URL
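Public objects are served at a predictable URL, which you can check with curl (bucket and object names are placeholders):

# A 200 response means the image is publicly reachable
curl -I https://storage.googleapis.com/my-images-bucket/logo.png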
You can now use your image anywhere: it’s on the internet and anyone can see it with the URL, so you can use it to store the images of your React / Angular app, and all of them will work.
There are some differences between Google Cloud and AWS when you want to deploy your function for your API.
My point of view is that Google Cloud Functions may be a bit simpler to deploy, but AWS is arguably the most reliable and complete cloud environment nowadays.
On both cloud environments you can upload the code as a ZIP, so you can use the same approach to adapt your code for AWS:
We will package the code of the repository into a ZIP
Differences between Google Cloud Functions and AWS Lambda (Node.js)
THE DEPENDENCIES
– In Google Cloud, the dependencies are automatically installed from the package.json file
– In AWS, you need to install the dependencies with npm or yarn, which builds the node_modules folder. If you don’t have the node_modules folder, your Lambda function will not work with the following error:
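To make sure node_modules ships with the function, a minimal packaging sketch (file names are assumptions):

# Install the dependencies locally so node_modules is included in the archive
npm install
# Zip the handler, the dependencies, and the manifest for upload to Lambda
zip -r function.zip index.js node_modules package.json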
If you want to install a package on CentOS with YUM, it is possible that you get the following error:
Total download size: 20 M
Installed size: 60 M
Is this ok [y/d/N]: y
Downloading packages:
Delta RPMs disabled because /usr/bin/applydeltarpm not installed.
PACKAGE.x86 FAILED
https://rpm.nodesource.com/pub_15.x/el/7/x86_64/PACKAGE.x86_64.rpm: [Errno 14] HTTPS Error 404 - Not Found
Trying other mirror.
To address this issue please refer to the below wiki article
https://wiki.centos.org/yum-errors
If above article doesn't help to resolve this issue please use https://bugs.centos.org/.
Error downloading packages:
2:PACKAGE.x86_64: [Errno 256] No more mirrors to try.
To solve this problem, we should reset the sources used by YUM, the package manager of CentOS. To do so, run the following command:
yum clean all
It cleans the cached sources of the repos. You should now be able to install the packages you want with yum install <package>
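For reference, the full sequence I would try (yum makecache is optional; it rebuilds the metadata cache before the retry):

# Drop cached repo metadata that may point at stale mirrors
yum clean all
# Rebuild the cache, then retry the installation
yum makecache
yum install <package>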
When importing a SQL file on Google Cloud (MySQL 5.7), I faced the following error:
exit status 1 ERROR 1227 (42000) at line 18: Access denied; you need (at least one of) the SUPER privilege(s) for this operation
You can see the error when you go to “Operations”, where you can see the logs of the MySQL instance
In order to solve this problem, you should understand that we do not have SUPER privileges on a Google Cloud SQL instance: if you run “SHOW GRANTS” on your MySQL user, it will show that you have almost all the rights, but not all of them.
This problem happens when you try to import specific things such as triggers, views, etc., so you should be careful when you create the SQL export. In my case, with MySQL Workbench 8.0, I get this error by default when I try to import the generated SQL file on Google Cloud. Importing an export made by Google Cloud into another instance seems to work, so the problem is that the exported SQL file must not contain statements that need specific privileges. I found a configuration that worked for me, and now I don’t have this annoying error anymore. Great! Here are the options I used. First, when you export, go to Advanced Options…
Then you should check the following options:
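If you export from the command line instead of Workbench, a roughly equivalent mysqldump sketch (host, user, and database names are placeholders; --set-gtid-purged=OFF omits the GTID statements and --skip-triggers leaves the triggers out):

# Export without statements that need SUPER privileges on the target
mysqldump -h 127.0.0.1 -u myuser -p \
    --set-gtid-purged=OFF --skip-triggers --single-transaction \
    mydatabase > export.sql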
Now you can try to import the SQL file on Google Cloud, it should work! F*ck yeah!