
How to migrate the Elastifile EMS boot device to persistent SSD

When using advanced Elastifile Cloud File System features such as AsyncDR or Snapshot Backup, the load on the EMS pd-standard boot disk may reach the persistent disk I/O limit.

To resolve this, it is recommended to move the EMS instance to a pd-ssd boot disk.

When instructed, this procedure needs to be performed online with Elastifile support.

First, populate all the variables used throughout this KB:

export EMS_INSTANCE_NAME=""
export GCP_PROJECT_ID=""
export GCP_REGION=""
export GCP_ZONE=""
export SNAPSHOT_NAME=$EMS_INSTANCE_NAME"-snapshot"
export DISK_NAME=$EMS_INSTANCE_NAME
EMS_INSTANCE_NAME = Current EMS instance name in the GCP console
GCP_PROJECT_ID = GCP project ID used by the current EMS instance
GCP_REGION = GCP region used by the current EMS instance
GCP_ZONE = GCP zone used by the current EMS instance
SNAPSHOT_NAME = Snapshot name for the current EMS boot disk (EMS_INSTANCE_NAME-snapshot)
DISK_NAME = Current EMS boot device name; by default the same as the EMS instance name
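
For illustration, a hypothetical environment could be populated as follows (all values below are placeholders; substitute your own):

export EMS_INSTANCE_NAME="elastifile-ems-1"
export GCP_PROJECT_ID="my-gcp-project"
export GCP_REGION="us-central1"
export GCP_ZONE="us-central1-a"
export SNAPSHOT_NAME=$EMS_INSTANCE_NAME"-snapshot"
export DISK_NAME=$EMS_INSTANCE_NAME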

Stop the current EMS instance -

From this step until the procedure is complete, the EMS UI and CLI will not be accessible. NFS I/O will continue as usual, without interruption.
gcloud compute instances stop $EMS_INSTANCE_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE
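
Before continuing, it is worth confirming the instance has actually stopped; the status should report TERMINATED:

gcloud compute instances describe $EMS_INSTANCE_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE --format='value(status)'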

Snapshot the current EMS boot disk -

gcloud compute disks snapshot $DISK_NAME --storage-location $GCP_REGION --project $GCP_PROJECT_ID --zone $GCP_ZONE --snapshot-names $SNAPSHOT_NAME
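
Since the snapshot is the fallback for the rest of this procedure, verify it completed before touching the boot disk; it should report READY:

gcloud compute snapshots describe $SNAPSHOT_NAME --project $GCP_PROJECT_ID --format='value(status)'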

Detach the current EMS HDD boot device -

gcloud compute instances detach-disk $EMS_INSTANCE_NAME --disk=$DISK_NAME --zone $GCP_ZONE --project $GCP_PROJECT_ID
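
To double-check the detach, you can list the instance's remaining attached disks; the old boot device should no longer appear (the disks[] projection follows the Compute Engine instance resource fields):

gcloud compute instances describe $EMS_INSTANCE_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE --format='value(disks[].deviceName)'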

Delete the current EMS HDD boot disk -

Do not proceed to this step until all previous steps have finished successfully.

gcloud compute disks delete $DISK_NAME --zone $GCP_ZONE --project $GCP_PROJECT_ID
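
If you want to confirm the old disk is gone, a quick filtered listing should come back empty:

gcloud compute disks list --project $GCP_PROJECT_ID --zones $GCP_ZONE --filter="name=$DISK_NAME"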
 

Create a new EMS boot device based on pd-ssd -

gcloud compute disks create $DISK_NAME --size 100GB --source-snapshot $SNAPSHOT_NAME --type=pd-ssd --zone $GCP_ZONE --project $GCP_PROJECT_ID
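
To verify the new disk was created with the expected type and size, describe it; the type URL should end in pd-ssd:

gcloud compute disks describe $DISK_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE --format='value(type,sizeGb)'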

Attach the new SSD boot device to the EMS -

gcloud compute instances attach-disk $EMS_INSTANCE_NAME --disk=$DISK_NAME --boot --zone $GCP_ZONE --project $GCP_PROJECT_ID 

gcloud compute instances set-disk-auto-delete $EMS_INSTANCE_NAME --auto-delete --disk=$DISK_NAME --zone=$GCP_ZONE --project $GCP_PROJECT_ID
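
As a final check before booting, you can confirm the disk is attached as the boot device with auto-delete enabled (again using the instance resource's disks[] fields):

gcloud compute instances describe $EMS_INSTANCE_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE --format='value(disks[].boot,disks[].autoDelete)'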

Start the EMS -

gcloud compute instances start $EMS_INSTANCE_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE
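
If you want to confirm the instance came up before testing the UI, the status should report RUNNING:

gcloud compute instances describe $EMS_INSTANCE_NAME --project $GCP_PROJECT_ID --zone $GCP_ZONE --format='value(status)'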

 

The new EMS is ready. Connect to the EMS UI and CLI, and check that it is functioning correctly:

  • UI - the main dashboard works.
  • UI - check the standard flows used day by day.
  • CLI run - 
    source /root/elfs_admin;elfs-cli system show --id 1
  • CLI run - 
    ecs-cli system
If all the commands above return OK and the UI is functioning correctly, then the migration to the SSD boot device has finished successfully.

 
