Copying MEG data
Operators must have access to a BlueBEAR RDS Project and their own BlueBEAR Linux Account prior to transferring data from the MEG Acquisition computer ("Sinuhe").

Data is stored here - in the BEAR Cave!
Requesting RDS / Activating a Linux Account
RDS can be requested through the IT Service Desk
- After logging in, select the option "Research", then "Request a new BEAR Project".
RDS can only be requested by a member of Staff (a PI or Supervisor).
The Project MUST have BlueBEAR compute services added ("BlueBEAR Linux HPC").
A BlueBEAR Linux Account is needed to copy Acquired data off the Console. To request a BlueBEAR Linux Account, after logging in ...
- Select the option "Research", then select "My BEAR Accounts" and then "Activate your BEAR Linux Account (you must be on a suitable project)".
The account must first be associated with a Project before data can be copied off the Console.
Addition of the MEG Operator's email address to the Project needs to be requested, either by the MEG Operator or the PI.
Transferring Acquired data to RDS
Use the scp command from a terminal window on the MEG console. To open one, either ...
- Click the Application Launcher button (bottom left) and select Terminal, or
- Right-click anywhere on the Desktop and select Konsole from the menu that pops up.
Change to the relevant Project directory in neuro-data ...
cd /data/neuro-data/project-name
or, more likely ...
cd /data/neuro-data/project-name/subject-name
- subject-name will be the anonymised code generated by the Participant Logging Computer (PLC), or possibly no_name if just Acquiring data for Project Development.
- If a subject-name is not provided when setting up megacq, a sub-directory called no_name is generated instead in the project-name directory.
- When files are saved in the subject-name directory, a sub-directory with the format YYMMDD is created.
- This YYMMDD sub-directory will contain the saved MEG acquisition .FIF data file.
- If another Acquisition is started, any subsequently saved data will be in the same YYMMDD directory (see the example layout below).
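To make the layout concrete, a hypothetical project "my-project" with one subject session and one no-name session might look like this on the Console (all names here are made up for illustration) ...

/data/neuro-data/my-project/
    ab12cd/                 (anonymised subject code from the PLC)
        191225/             (YYMMDD sub-directory created on first save)
            recording.fif
    no_name/                (created when no subject-name was given to megacq)
        191226/
            test_recording.fif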
Run the scp command to copy over the data ...
scp -rp data username@bluebear.bham.ac.uk:/rds/projects/year/project-name
or
scp -rp data username@bluebear.bham.ac.uk:/rds/projects/letter/project-name
(-r copies directories recursively; -p preserves modification times and permissions.)
- Replace data with the folder to be copied, or use the asterisk character, *, for the complete contents of /data/neuro-data/project-name or /data/neuro-data/project-name/subject-name.
- Replace username with the Operator's username.
- Replace project-name with the Operator's RDS path.
- Replace year or letter with the year or initial letter that appears in the Operator's RDS project path.
For example ...
cd /data/neuro-data/empty_room
or
cd /data/neuro-data/empty_room/no_name
then
scp -rp 191225 smithj25@bluebear.bham.ac.uk:/rds/projects/2018/jenseno-meg-qualitycontrol/smithj25/meg-data/
or perhaps
scp -rp * smithj25@bluebear.bham.ac.uk:/rds/projects/j/jenseno-meg-raw-data/data/neuro-data/empty-room/
Check the data has copied correctly ...
ssh smithj25@bluebear.bham.ac.uk
then
cd /rds/projects/2018/jenseno-meg-qualitycontrol/smithj25/meg-data/
or
cd /rds/projects/j/jenseno-meg-raw-data/data/neuro-data/empty-room/
- Run the command ls -l to list the data for checking (a stronger checksum comparison is sketched below).
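For a stronger check than comparing file listings by eye, checksums can be compared on both machines. The paths below reuse the hypothetical examples above, and assume the saved files carry a .fif extension ...

On the MEG console ...
cd /data/neuro-data/empty_room/no_name
md5sum 191225/*.fif

Then on BlueBEAR ...
cd /rds/projects/2018/jenseno-meg-qualitycontrol/smithj25/meg-data/
md5sum 191225/*.fif

If the two sets of checksums match, the files copied over intact.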
NOTE: Please don't save data anywhere other than the /data/neuro-data/project-name directory on the MEG console.
BlueBEAR Data Backup
For Duty of Care obligations, the following information is backed up overnight to an RDS Project space on BlueBEAR.
- Daily acquired MEG data
- Project & Subject databases, including the ctc and sss folders
- The DACQ Tuning directory
A backup script runs as a cron job, generating backup-log and backup-error text files that are emailed as necessary.
MEG Operators need to ensure Acquired data is always kept in the correct folder on the Console ...
/data/neuro-data/project-name
If data isn't copied overnight due to errors with BlueBEAR, any missing files are backed up the next time the script runs and BlueBEAR is available.
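For illustration only, a nightly job of this kind might be structured roughly as follows. The paths, RDS destination, email address, and the use of rsync are all assumptions for the sketch, not the actual Console script ...

#!/bin/bash
# Hypothetical sketch of a nightly MEG backup job (not the real script).
SRC=/data/neuro-data/
DEST=backup-user@bluebear.bham.ac.uk:/rds/projects/x/example-meg-backup/
LOG=/tmp/backup-log.txt
ERR=/tmp/backup-error.txt

# rsync only transfers files that are missing or changed at the destination,
# so files skipped during a BlueBEAR outage are picked up on the next run.
rsync -a "$SRC" "$DEST" > "$LOG" 2> "$ERR"

# Email the error log only when something actually went wrong.
if [ -s "$ERR" ]; then
    mail -s "MEG backup errors" meg-operator@bham.ac.uk < "$ERR"
fi

A crontab entry such as the following would run it nightly at 2am ...
0 2 * * * /usr/local/bin/meg-backup.sh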
Acquired data is then routinely removed from the DACQ data partition, on a roughly monthly basis, to allow continued MEG Acquisition.
No quota system is in place and no automatic file-deletion daemon is running, but MEG Operators still need to copy their data off to their own RDS space in a timely manner.
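A quick way for Operators to see how full the acquisition partition is getting (assuming the data partition is mounted at /data, as the paths above suggest) ...
df -h /data
and, for the size of an individual project's folder ...
du -sh /data/neuro-data/project-name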