ConQuest DICOM Cheatsheet

The latest version of the ConQuest DICOM server can be found here.

dgate.exe usage

Running ‘dgate.exe -h’ at the DOS command line will give you a list of command-line options for managing your PACS.  I use this to manually add a folder of images to my PACS.  Say your PACS is installed under:

S:\dicomSqlServer

Then, to add a new folder of images to the PACS database, I would use the command below (with [DirectoryToAddToPACS] being the directory containing the images you want in your PACS).  This command uses -v (report standard output to the DOS window) and -frDEVICE,DIR (regen single directory DIR on DEVICE).  You don’t need the -v option, but it shows you the images being imported into the PACS right in the command window, so I see no reason to leave it out:

S:\dicomSqlServer\dgate64.exe -v -frMAG0,[DirectoryToAddToPACS]

NOTE: in the above command, ‘MAG0’ is the name of the storage device defined in your ConQuest configuration (dicom.ini); the imported images are registered in the database under that device name.
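If you ever need to rebuild the whole PACS database rather than a single directory, the help output below also lists a full regenerate option (-r, ‘Init/regenerate DB’).  As a sketch, run from the same install directory (be aware it re-scans every device, so it can take a while):
S:\dicomSqlServer\dgate64.exe -v -r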
Deleting images from the PACS
If you want to delete an image from the PACS, then I suggest running a simple SQL query rather than trying the command-line methods below (the help does not give beginners enough information to get that done).  So just create a delete query in SQL like so:
DELETE FROM [DICOMImages] WHERE [SOPInstanc]='[your image SOPInstance UID]';
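Alternatively, the server has delete commands of its own (listed under ‘Delete options’ in the help output below).  As a rough sketch, with the patient ID as a placeholder:
S:\dicomSqlServer\dgate64.exe --deletepatient:[PatientID]
Unlike the SQL query above, which only removes the database record, these server-side deletes appear to act on the stored data as well, so use them with care.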
To get a list of the command line options you have for your version of ConQuest, run the DOS command ‘dgate.exe -h’ (the -h is for help).  Below is what is available for version 1.4.16; it may change slightly depending on your PACS version:

DGATE: UCDMC/NKI DICOM server thread and PACS utility application 1.4.16
Usage:
(1) DGATE <-!#|-v|-u#|-^#>    Report as in dicom.ini|stdout|UDP|File(#=port)
          [-p#|-qIP|-b]       Set port|Set target IP|run debug 1-thread mode
          [-wDIR]             Set the working directory for dgate(ini,dic,...)
          [-i|-r|-arDEVICE]   Init|Init/regenerate DB|Regen single device
          [-d|-m|-k]          List (-d) devices (-m) AE map (-k) DICOM.SQL
          [-t|-o]             Test console|Test database
          [-sOpt|-esap d u p] Create ODBC source (WIN32), database with SApw
          [-nd|-nc#|-jd|-jc#] NKI de-/compress#|JPEG de-/compress# FILE
          [-j*##|-j-##FILE]   Recompress FILE to ##
          [-as#,N|-amFROM,TO] Select#KB to archive of MAGN|move device data
          [-au|-aeFROM,TO]    Undo select for archiving|rename device
          [-av|-atDEVICE]     Verify mirror disk|Test read files for DEVICE
          [-abJUKEBOX1.2,N]   Make cacheset to burn JUKEBOX1,CD2 from MAGN
          [-acJUKEBOX1.2]     Verify JUKEBOX1,CD2 against cacheset
          [-adJUKEBOX1.2]     Verify and delete cacheset for JUKEBOX1, CD2
          [-f<p|t|s|i>ID]     Delete DB for Patient, sTudy, Series, Image
          [-f<e|d|z>file]     Enter/Delete DB of file, Zap server file
          [-faFILE<,ID>]      Add file to server<optionally change PATID>
          [-zID]              Delete (zap) patient
          [-frDEVICE,DIR]     Regen single directory DIR on DEVICE
          [-f<c|k>PATID,file] Change/Kopy PATID of file (irreversible/once)
          [-f?file|-fu|-c#]   get UID of file|Make new UID|UID helper(0..99)
          [-ff#]              Delete old patients until #MB free
          [-gSERVER,DATE]     grab images from SERVER of date not on here
                              Otherwise: run as threaded server, port=1111
(2) DGATE FileMapping         Run server child; shared memory has socket#
(3) DGATE <-pPORT> <-qIP> --command:arguments
                              Send command to (this or other) running server
                              (works directly - use with care)
Delete options:
    --deleteimagefile:file                  Delete given image file from server
    --deletepatient:patid                   Delete given patient from server
    --deletestudy:patid:studyuid            Delete given study from server
    --deletestudies:date(range)             Delete studies from server on date
    --deleteseries:patid:seriesuid          Delete given series from server
    --deleteimagefromdb:file                Delete given file from db only
    --deletesopfromdb:pat,study,series,sop  Delete specified image from db only

DICOM move options:
    --movepatient:source,dest,patid         Move patient, source e.g. (local)
    --movestudy:source,dest,patid:studyuid  Move study, patid: optional
    --movestudies:source,dest,date(range)   Move studies on date
    --moveseries:src,dst,patid:seruid,stuid Move series patid: optional

Modification of dicom objects:
    --modifypatid:patid,file  Change patid of given file
    --anonymize:patid,file    Anonymize given file
    --modifyimage:file,script Change items in file
    --mergestudy:uid,uid,..   Start merging studies with given studyuids
    --mergestudyfile:file     Use to process all files to merge
    --mergeseries:uid,uid,..  Start merging series with given seriesuids
    --mergeseriesfile:file    Use to process all files to merge
    --attachanytopatient:any,sample     Modify uids to attach any object to
    --attachanytostudy:any,sample        patient|study|series in sample file
    --attachanytoseries:any,sample       Do not attach same at different levels
    --attachrtplantortstruct:plan,struc Attach rtplan to rtstruct

Maintenance options:
    --initializetables:       Clear and create database
    --initializetables:1      Clear and create database without indices
    --initializetables:2      Clear and create worklist database
    --regen:                  Re-generate entire database
    --regendevice:device      Re-generate database for single device
    --regendir:device,dir     Re-generate database for single directory
    --regenfile:file          Re-enter given file in database
    --makespace:#             Delete old patients to make #MB space
    --quit:                   Stop the server
    --safequit:               Stop the server when not active

Logging options:
    --debuglog_on:file/port   Start debug logging
    --log_on:file/port/pipe   Start normal logging
    --debuglevel:#            Set debug logging level
    --display_status:file     Display server status
    --checklargestmalloc:     Estimates DICOM object size limit
    --get_freestore:dev,fmt   Report free #Mb on device
    --testmode:#              Append # to dicom filenames
    --echo:AE,file            Echo server; show response

Configuration options:
    --get_param:name,fmt      Read any parameter from DICOM.INI
    --get_ini_param:name,fmt  Read any parameter from DICOM.INI
    --get_ini_num:index,fmt   List any entry from DICOM.INI
    --get_ini:fmt             List all entries from DICOM.INI
    --put_param:name,value    Write any parameter to DICOM.INI
    --delete_param:name       Delete any parameter from DICOM.INI
    --read_ini:               Re-read all parameters from DICOM.INI
    --get_amap:index,fmt      List any entry from ACRNEMA.MAP
    --get_amaps:fmt           List all entries from ACRNEMA.MAP
    --put_amap:i,AE,ip,p#,cmp Write entry in memory for ACRNEMA.MAP
    --delete_amap:index       Delete entry in memory for ACRNEMA.MAP
    --write_amap:             Write ACRNEMA.MAP from memory to disk
    --read_amap:              Re-read ACRNEMA.MAP from disk to memory
    --get_sop:index,fmt       List any accepted service class UID
    --put_sop:index,UID,name  Write/add accepted service class UID
    --delete_sop:index        Delete accepted service class UID
    --get_transfer:index,fmt  List any accepted transfer syntax
    --put_transfer:in,UID,nam Write/add accepted transfer syntax
    --delete_transfer:index   Delete accepted transfer syntax
    --get_application:idx,fmt List any accepted application UID
    --put_application:i,U,n   Write/add accepted application UID
    --delete_application:inde Delete accepted application UID
    --get_localae:index,fmt   List any accepted local AE title
    --put_localae:in,AE,name  Write/add accepted local AE title
    --delete_localae:index    Delete accepted local AE title
    --get_remoteae:index,fmt  List any accepted remote AE title
    --put_remoteae:in,AE,name Write/add accepted remote AE title
    --delete_remoteae:index   Delete accepted remote AE title
    --get_dic:index,fmt       List any dicom dictionary item
    --get_sqldef:level,in,fmt List any database field definition

Communication options:
    --addimagefile:file,patid      Copy file into server, optionally new patid
    --addlocalfile:file,patid      Copy local file into server, opt. new patid
    --loadanddeletedir:dir,patid   Load folder and delete its contents
    --loadhl7:file                 Load HL7 data into worklist
    --dump_header:filein,fileout   Create header dump of file
    --forward:file,mode,server     Send file with compr. mode to server
    --grabimagesfromserver:AE,date Update this server from other
    --prefetch:patientid           Prefetch all images for improved speed
    --browsepatient:searchstring   Select patient in windows GUI
    --submit:p,s,ser,sop,target,pw Immediate sftp submit of data
    --export:p,s,ser,sop,file,scr  Immediate process and zip/7z data
    --scheduletransfer:options     Background sftp transfer as above

Test options:
    --genuid:                      Generate an UID
    --changeuid:UID                Give new UID as generated now or before
    --changeuidback:UID            Give old UID from one generated above
    --checksum:string              Give checksum of string
    --testcompress:file            Enter file in server with many compressions
    --clonedb:AE                   Clone db from server for testing

Conversion options:
    --convert_to_gif:file,size,out,l/w/f Downsize and convert to mono GIF
    --convert_to_bmp:file,size,out,l/w/f Downsize and convert to color BMP
    --convert_to_jpg:file,size,out,l/w/f Downsize and convert to color JPG
    --convert_to_dicom:file,size,comp,f  Downsize/compress/frame DICOM
    --extract_frames:file,out,first,last Select frames of DICOM file
    --count_frames:file                  report # frames in DICOM file
    --uncompress:file,out                Uncompress DICOM
    --wadorequest:parameters             Internal WADO server

Database options:
    --query:table|fields|where|fmt|file Arbitrary query output to file
    --query2:tab|fld|whe|fmt|max|file   Same but limit output rows to max
    --patientfinder:srv|str|fmt|file    List patients on server
    --studyfinder:srv|str|fmt|file      List studies on server
    --seriesfinder:srv|str|fmt|file     List series on server
    --serieslister:srv|pat|stu|fmt|file List series in a study
    --imagelister:srv|pat|ser|fmt|file  List files in a series
    --extract:PatientID = 'id'          Extract all dbase tables to X..
    --extract:                          Extract patient dbase table to XA..
    --addrecord:table|flds|values       Append record, values must be in ''
    --deleterecord:table,where          Delete record from table
For DbaseIII without ODBC:
    --packdbf:                          Pack database, recreate memory index
    --indexdbf:                         Re-create memory index

Archival options:
    --renamedevice:from,to              Rename device in database
    --verifymirrordisk:device           Verify mirror disk for selected device
    --testimages:device                 Test read all images on device
    --movedatatodevice:to,from          Move patients from one device to another

    --moveseriestodevice:to,from        Move series from one device to another
    --selectlruforarchival:kb,device    Step 1 for archival: to device.Archival
    --selectseriestomove:device,age,kb  Step 1 for archival: to device.Archival
    --preparebunchforburning:to,from    Step 2 for archival: moves to cache
    --deletebunchafterburning:deviceto  Step 3 for archival: deletes from cache
    --comparebunchafterburning:deviceto Part step 3 - compare jukebox to cache
    --restoremagflags:                  Undo archival sofar
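
As usage form (3) near the top of the help output shows, these ‘--command:arguments’ options can also be sent to a PACS server that is already running.  For example (a sketch only; use whichever port your server actually listens on), you could ask the running server to write its current status to a text file:

S:\dicomSqlServer\dgate64.exe -p[your server port] --display_status:status.txt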

BeyondCompare – Get it

If you are ever in need of file or folder comparison on your computer, then you need BeyondCompare.  One use of BeyondCompare is to compare directories that contain files with the same names but different file extensions.  For example, I had directories of thousands of DICOM images (.dcm file extension).  I ran a batch process on these images to create JPEG versions (.jpg file extension) of the same files, with the same file names and directory structure.  The JPEG images were initially created in the same directories as the DICOM images, and I used BeyondCompare to quickly move them out into their own directory.  But then I wanted to make sure every DICOM image had a JPEG counterpart, and I needed BeyondCompare for that as well.

In BeyondCompare 3, I aligned my DICOM files (on the left) and JPEG files (on the right) using the following session settings in the software:

Select Session > Session Settings.
Go to the Misc tab.
Click New.
For "Align left file (or folder)" enter "*.dcm".
For "with right file (or folder)" enter "*.jpg".
Click OK until you're back to the main window.

This will show you if there is a matching JPEG file name for each DICOM image.

BeyondCompare is an invaluable tool if you are a software developer, data manager, or data analyst.

Convert SAS character column to numeric

Say you have a dataset with a column, we will call it ‘myVals’, with the values (1, 2, 3, T, N).  You want to be able to run summary statistics on ‘myVals’ easily, say in SAS Enterprise Guide, but the T and N values get in the way because the column is stored as character.

The quickest way to convert the data (ignoring the T and N values) is to simply change the SELECT statement for the data so that the myVals field is processed with the INPUT function, like so:

SELECT INPUT(myVals, BEST2.) AS myVals
FROM ...

or (to preserve the character values) use:

(CASE WHEN 'T' = myVals THEN INPUT('.T',BEST2.)
      WHEN 'N' = myVals THEN INPUT('.N',BEST2.)
      ELSE INPUT(myVals,BEST2.) END) AS myVals
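
Put together as a complete query, this looks something like the sketch below (the input and output dataset names, have and want, are placeholders and not from my data):

PROC SQL;
  CREATE TABLE want AS
  SELECT (CASE WHEN 'T' = myVals THEN INPUT('.T',BEST2.)
               WHEN 'N' = myVals THEN INPUT('.N',BEST2.)
               ELSE INPUT(myVals,BEST2.)
          END) AS myVals
  FROM have;
QUIT;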

Converting the ‘T’ and ‘N’ values to ‘.T’ and ‘.N’ (SAS special missing values) is what allows the column to become numeric in SAS.  However, if you already have your dataset imported into SAS, there are two steps you need to take.  First, run a simple select query on the dataset and create a ‘computed column’ that recodes the ‘T’ and ‘N’ values to ‘.T’ and ‘.N’.  The code for this will look like this:

(CASE
  WHEN 'T' = myVals THEN '.T'
  WHEN 'N' = myVals THEN '.N'
  ELSE myVals
END) FORMAT=$CHAR2. AS new_myVals

The new_myVals column is still in character format at this point, so we run a second query (I know of no way to do this all in a single query) where we will now convert the new_myVals column to numeric format using:

INPUT(new_myVals, BEST2.)

NOTE: I tried running the CASE statement and then the INPUT function on its result in the same query, but the INPUT function does not seem to work on calculated columns; that is why we must first create a new dataset with the .T and .N values, after which the INPUT function works on the new (numeric-friendly) column.  Also note that running summary statistics on this column will not give results for the .T and .N values.  They remain in the dataset, but you would need to change them to actual numbers if you wanted them included in simple summary statistics.
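
For reference, here is a sketch of the full two-step version described above (again, the dataset names have, step1 and want are placeholders):

PROC SQL;
  /* Step 1: recode T and N to the special missing codes .T and .N (still character) */
  CREATE TABLE step1 AS
  SELECT (CASE
            WHEN 'T' = myVals THEN '.T'
            WHEN 'N' = myVals THEN '.N'
            ELSE myVals
          END) AS new_myVals FORMAT=$CHAR2.
  FROM have;

  /* Step 2: read the recoded character column as numeric */
  CREATE TABLE want AS
  SELECT INPUT(new_myVals, BEST2.) AS myVals_num
  FROM step1;
QUIT;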

BEWARE: If you export your SAS dataset to Excel using the simple Excel export while the .T and .N values are stored in your new numeric-formatted column, these values WILL NOT be sent to the Excel document; they will simply be treated as blanks or empty values.  Thus, all the more reason to convert the character values to numbers.  HOWEVER, if you export the dataset while the .T and .N values are still stored in a character-formatted column, then they WILL appear in the exported Excel file.  This is one of the problems of exporting data (and moving data between software programs): data may be lost, unbeknownst to the person performing the export.

Outlook Web App – URL to email message

Sometimes you have an email that you would like to link to from your calendar (say, a payment confirmation for an event).  The only good way I know of to get this URL is to open the message, go into the actions menu, choose ‘View original message’, and then copy the URL from the address bar.  NOTE: if you move the message to another folder in your mailbox, the URL will change, so I suggest moving it to its final folder first and then grabbing the URL.

iRods training install

Go to http://irods.org/ugm2015/training-preparation/ to download the VM files.  To improve performance, I would suggest giving the VM 3 processors in its settings if you can get away with it (give it a try).

Start the VM and log in using ‘learner’ as the password.

Open Firefox and do a Google search on ‘github irods’.  Go to the iRods GitHub site (https://github.com/irods).  Look under ‘docs/Admin_Guide.md’ and read through the basic instructions.

Go to the iRods download page and download the Debian packages for both irods-icat and the PostgreSQL database plugin (save them to your Downloads folder).  Open a terminal and follow the install instructions below the download links.  Change to the Downloads directory, where the files you just downloaded should be, by entering:

cd Downloads

Next we install the gdebi package, which will handle package dependencies for the installs going forward:

sudo apt-get install -f gdebi

Now, return to the Admin Guide.  We need to install PostgreSQL and pgAdmin3 (optional).  In the terminal window, enter:

sudo apt-get install -f postgresql

Next, install the ODBC drivers for PostgreSQL (I was having issues without them originally):

sudo apt-get install -f odbc-postgresql

To install the optional GUI admin tool, you can use:

sudo apt-get install pgadmin3

Now that we have the database installed, we can start on the iRods setup.  We begin by entering the following into the terminal (this starts a PostgreSQL session in which the SQL statements are entered) to set up the database:

sudo su - postgres
postgres$ psql
psql> CREATE USER irods WITH PASSWORD 'testpassword';
psql> CREATE DATABASE "ICAT";
psql> GRANT ALL PRIVILEGES ON DATABASE "ICAT" TO irods;
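
When the database setup is done, quit psql and leave the postgres account before continuing (a small step that is implied but not spelled out above):

psql> \q
postgres$ exit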

At this point we install the packages for irods and the Postgres database plugin by executing the two statements below in the terminal (answer ‘y’ at the prompt after the second statement):

sudo gdebi irods-icat-4.0.3-64bit.deb
sudo gdebi irods-database-plugin-postgres-1.4.deb

Next I ran the setup script:

sudo /var/lib/irods/packaging/setup_irods.sh

During previous failed setups I was concerned about my use of ‘localhost’ as the database server hostname instead of ‘127.0.0.1’, which I had used on my first successful iRods setup.

———–

To check if your iRods server is running, switch to your irods account in the terminal with:

sudo su - irods

Then enter the following to see the iRods setup (assuming the server is running):

ienv

If the server information is not available, then start the iRods server (switching to the irods user first if you have not already):
sudo su - irods
./iRODS/irodsctl start

If you enter the iRods start command and receive a message that port 1247 is in use, then iRods is probably already running.  To check the ports being used, enter:

sudo lsof -i
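
If you only care about the iRods port itself, you can narrow this down with a small variation on the command above:

sudo lsof -i :1247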

From here you should probably continue with the ‘Changing the administrator account password’ section of the iRods admin documentation.  (Note: if you are changing the password in that section, you may need to run iinit twice, since the authentication may fail the first time; running it a second time worked for me.)

To change the learner account password (used to get into the VM), make sure you have exited the irods account and type the following in the terminal:

passwd

Now let us see what users are assigned to iRods (you need to switch to the irods account to execute this; remember ‘sudo su - irods’):

iadmin lu

If you only have a ‘rods’ account then create a new user:

iadmin mkuser [your username here] rodsuser
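
You will probably also want to give the new user a password so it can authenticate; iadmin’s moduser subcommand handles that (an extra step beyond the notes above, with the bracketed values as placeholders):

iadmin moduser [your username here] password [new password]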


Get a list of the full network shared drive paths on your Windows machine

Sometimes our network administrators create network shared drives that auto-mount when we log onto our network domain.  To get the full list of mapped drives and their network paths, use the following command at the DOS prompt:

net use
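
The output looks roughly like the sketch below (the drive letter and share path are made-up examples):

Status       Local     Remote                    Network
-------------------------------------------------------------------------------
OK           S:        \\fileserver\dicomShare   Microsoft Windows Network
The command completed successfully.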

To have the list saved to a text file use:

net use > mapped_drives.txt

[Note: the mapped_drives.txt file will be saved to the current directory of the command prompt when you run the command shown above.  By default that will probably be ‘C:\Users\[your username]’.]

Change SAS library path for a library reference (library name)

If you ever decide to move your SAS library files to another location in your file space, simply execute the following SAS program (rather than trying to find the function in SAS Enterprise Guide or whatever SAS tool you are using):

libname [name of library] '[path to the library files]';

Note: in my setup the path needs to use ‘/’ instead of ‘\’ for the directory delimiter, so your path might look something like this:
F:/My SAS libraries/library-1/
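
So a filled-in example might look like this (the library name ‘mylib’ is just an example):

libname mylib 'F:/My SAS libraries/library-1/';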