This blog is all about the IT world: support, technical expertise, and how you perform a task. I post information about the work I have done, whenever it was done. All the steps and troubleshooting here were performed on live infrastructure, so downtime and the small details were tested before being implemented. Of course, every infrastructure has its own protocols, so I would suggest checking them first.
Tuesday, January 22, 2013
Robocopy Command
Hi People
When it comes to copying data from a source to a destination, we have several commands and utilities, such as xcopy, robocopy, and copy.
One of these commands is robocopy.
Command Applies To: Windows 7, Windows Server 2008, Windows Server 2008 R2
It works very reliably; if the data size is less than 100 GB, I would suggest this command.
The command options are:
/Z : Copy files in restartable mode (survive network glitch).
/R:n : Number of Retries on failed copies - default is 1 million.
/W:n : Wait time between retries - default is 30 seconds.
/COPYALL : Copy ALL file info (equivalent to /COPY:DATSOU).
/E : Copy Subfolders, including Empty Subfolders.
/LOG:file : Output status to LOG file (overwrite existing log).
/FP : Include Full Pathname of files in the output.
/TEE : Output to console window, as well as the log file.
/V : Produce Verbose output log, showing skipped files.
/ETA : Show Estimated Time of Arrival of copied files.
The basic command syntax, with an example, would be:
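A minimal sketch of a typical invocation, using the options listed above (the server name, share, and local paths here are hypothetical):

```shell
robocopy \\FileServer01\Share\Data D:\Data /E /Z /COPYALL /R:3 /W:10 /V /FP /ETA /LOG:C:\Logs\robocopy.log /TEE
```

Note that /R:3 and /W:10 override the very large defaults (1 million retries, 30-second wait), so one permanently locked or unreachable file does not stall the copy for days.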
Hi, thanks for the information.
But what if the data size is bigger than 100 GB, for example 1 TB? What do you suggest?
Thanks so much.
I would suggest that anything more than 1 TB go to the SAN team, as they have their own tools, which can copy data of that size.
At the end of the day, the data has to be copied in the most effective way.