chore: automated commit
Files can be uploaded via the Editor interface - first choose the library and table.

<img src="/img/dcu-files1.png" height="350" style="border:3px solid black" >

## Excel Uploads

Thanks to our pro license of [sheetJS](https://sheetjs.com/), we can support all versions of Excel and large workbooks, and extract data extremely fast.

The rules for data extraction are:

* Scan the spreadsheet until a row is found containing all of the target columns (not case sensitive)
* Extract the data below that row, until the *first row containing a blank primary key value*

This is incredibly flexible, and means:

* data can be anywhere, on any worksheet
* data can contain additional columns (they are just ignored)
* data can be completely surrounded by other data

A copy of the original Excel file is also uploaded to the staging area.  This means that a complete audit trail can be captured, right back to the original source data.
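The two extraction rules above can be sketched as follows. This is an illustrative Python model of the logic, operating on a worksheet represented as a list of rows - it is *not* the actual sheetJS-based implementation, and the `extract_rows` helper (with the first target column treated as the primary key) is an assumption for demonstration only.

```python
def extract_rows(grid, target_cols):
    """Hypothetical sketch of the scan-and-extract rules (not the real
    sheetJS implementation).  `grid` is a worksheet as a list of rows;
    the first name in `target_cols` is treated as the primary key."""
    targets = [c.lower() for c in target_cols]
    for i, row in enumerate(grid):
        cells = ["" if c is None else str(c).lower() for c in row]
        # Rule 1: scan until a row is found with ALL target columns (not case sensitive)
        if all(t in cells for t in targets):
            idx = [cells.index(t) for t in targets]  # positions of the target columns
            pk = idx[0]
            records = []
            # Rule 2: extract data below, until the first blank primary key value
            for data_row in grid[i + 1:]:
                if pk >= len(data_row) or data_row[pk] in (None, ""):
                    break
                records.append(
                    [data_row[j] if j < len(data_row) else None for j in idx]
                )
            return records
    return []  # header row never found
```

Because only the target columns are kept (by their position in the header row), extra columns and surrounding cells are simply ignored - matching the flexibility described above.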

## CSV Uploads

The following should be considered when uploading data in this way:
 - The order of variables does not matter
 - The delimiter is extracted from the header row - so for `var1;var2;var3` the delimiter would be assumed to be a semicolon
 - The above assumes the delimiter is the first special character! So `var,1;var2;var3` would fail
 - The following characters should **not** be used as delimiters:
    - doublequote
    - quote
    - space
    - underscore
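The "first special character" rule can be illustrated with a short Python sketch. This is a guess at the logic for demonstration purposes - the `detect_delimiter` name and the exact definition of "special character" are assumptions, not Data Controller's actual code:

```python
def detect_delimiter(header_line):
    """Hypothetical sketch: assume the delimiter is the first
    non-alphanumeric character in the header row, excluding the
    characters the docs say must not be used as delimiters."""
    forbidden = {'"', "'", " ", "_"}
    for ch in header_line:
        if not ch.isalnum() and ch not in forbidden:
            return ch
    return None  # no delimiter found
```

Under this sketch, `detect_delimiter("var1;var2;var3")` returns `;`, while `detect_delimiter("var,1;var2;var3")` returns `,` - which is why the latter header would fail.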

When loading dates, be aware that Data Controller makes use of the `ANYDTDTE` and `ANYDTDTTME` informats (width 19).
This means that uploaded date / datetime values should be unambiguous (eg `01FEB1942` rather than `01/02/42`), as the latter could be interpreted as `02JAN2042` depending on your locale and `YEARCUTOFF` option settings.  Note that UTC dates with offset values (eg `2018-12-26T09:19:25.123+0100`) are not currently supported.  If this is a feature you would like to see, contact us.
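The ambiguity can be demonstrated with a small Python model of a two-digit-year window. This is illustrative only - it mimics the spirit of the SAS `YEARCUTOFF` option, and the `expand_date` helper, its `dayfirst` flag and the default cutoff are assumptions for demonstration, not the informats' actual behaviour:

```python
from datetime import datetime

def expand_date(datestr, dayfirst=True, yearcutoff=1926):
    """Hypothetical model of two-digit-year expansion: the year is
    placed in the 100-year window starting at `yearcutoff`, and
    `dayfirst` stands in for the locale's day/month ordering."""
    a, b, yy = (int(p) for p in datestr.split("/"))
    day, month = (a, b) if dayfirst else (b, a)
    year = yearcutoff - yearcutoff % 100 + yy  # same century as the cutoff
    if year < yearcutoff:
        year += 100  # roll forward into the 100-year window
    return datetime(year, month, day)
```

The same string thus yields `01FEB1942` under one locale/cutoff combination and `02JAN2042` under another - hence the advice to upload unambiguous values such as `01FEB1942`.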

!!! tip
    To get a copy of a file in the right format for upload, use the [file download](/dc-userguide/#usage) feature in the Viewer tab