excel updates

Allan Bowe 2020-04-16 22:06:52 +02:00
parent d074aee048
commit 198d11a9f2
5 changed files with 29 additions and 10 deletions


@@ -16,12 +16,12 @@ cat > $OUTFILE <<'EOL'
 ## Overview
 Data Controller source licences are extracted automatically from our repo using the [license-checker](https://www.npmjs.com/package/license-checker) NPM module
-<code>
+```
 EOL
 license-checker --production --relativeLicensePath --direct --start ../dcfrontend >> docs/licences.md
-echo '</code>' >> docs/licences.md
+echo '```' >> docs/licences.md
 echo 'building mkdocs'
 mkdocs build --clean
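The pattern this hunk changes - writing static markdown with a heredoc, then appending tool output inside a code fence - can be sketched as below. This is a minimal illustration, not the real CI script: the `license-checker` call is stubbed with `echo` so it runs anywhere, and `FENCE` holds the backticks purely to avoid nesting literal fences in this example.

```shell
# Sketch of the build-script pattern: heredoc for static markdown, then
# tool output appended between code fences. license-checker is stubbed.
OUTFILE=licences.md
FENCE='```'
cat > "$OUTFILE" <<EOL
## Overview
Data Controller source licences are extracted automatically from our repo
$FENCE
EOL
echo 'example-package@1.0.0 (MIT)' >> "$OUTFILE"   # stand-in for license-checker output
echo "$FENCE" >> "$OUTFILE"
```

In the real script the stub line is replaced by the `license-checker --production --relativeLicensePath --direct --start ../dcfrontend >> docs/licences.md` call shown in the hunk.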


@@ -5,7 +5,7 @@ The backend for Data Controller consists of a set of Stored Processes, a macro l
 ## Regular Deployment
-1 - Import `/sas/import.spk` using SAS Management Console. Make a note of the root location in which this was deployed - as this will be added to the `metadataRoot` value in the `h54sConfig.json` file in the [frontend](dci-frontend.md#details) deployment.
+1 - Import `/sas/import.spk` using SAS Management Console. Make a note of the root location in which this was deployed - as this will be added to the `metadataRoot` value in the `sasjsConfig.json` file in the [frontend](dci-frontend.md#details) deployment.
 2 - Create a physical staging directory. This folder will contain the logs and CSV files generated by Users. The SAS Spawned Server account (eg `sassrv`) will need write access to this location.


@@ -7,7 +7,7 @@ The Data Controller front end comes pre-built, and ready to deploy to the root o
 1 - Unzip dcfrontend.zip and upload the entire `datacontroller` directory to the static content server.
-2 - Open the `h54s.config` file and update the `metadataRoot` value to the location of the Stored Processes as per [backend](dci-backend.md) deployment. Remember to include the trailing slash (`/`).
+2 - Open the `sasjs.config` file and update the `metadataRoot` value to the location of the Stored Processes as per [backend](dci-backend.md) deployment. Remember to include the trailing slash (`/`).
 It should now be possible to use the application - simply navigate to `YOURSASWEBLOC.domain/yourRoot/datacontroller` and sign in!
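Step 2 above can also be scripted. The sketch below assumes `sasjs.config` is plain JSON (an assumption - check your deployed copy); the `metadataRoot` key comes from the docs, while `/yourRoot/stps/` is a placeholder path shown with the required trailing slash.

```shell
# Hypothetical sketch: update metadataRoot in sasjs.config non-interactively.
# The JSON layout and the /yourRoot/stps/ path are illustrative assumptions.
printf '{\n  "metadataRoot": "/old/path/"\n}\n' > sasjs.config   # stand-in file
sed -i 's|"metadataRoot": *"[^"]*"|"metadataRoot": "/yourRoot/stps/"|' sasjs.config
```

Note that on macOS/BSD `sed`, in-place editing needs `sed -i ''` rather than `sed -i`.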


@@ -1,9 +1,11 @@
 # Data Controller for SAS: File Uploads
-Files can be uploaded via the Editor interface - first choose the library and table, then click "Upload". Currently only CSV files are supported, although these can be provided with non standard delimiters (such as semicolon).
+Files can be uploaded via the Editor interface - first choose the library and table, then click "Upload". All versions of Excel are supported. If loading very large files (eg over 10MB) it is more efficient to use CSV format, as this bypasses the local rendering engine - but also the local DQ checks - so be careful! For CSV, alternative delimiters can be used (eg semicolons).
 <img src="/img/dcu-files1.png" height="350" style="border:3px solid black" >
+## CSV Uploads
 The following should be considered when uploading data in this way:
 - A header row (with variable names) is required
@@ -20,8 +22,25 @@ The following should be considered when uploading data in this way:
 - space
 - underscore
-When loading dates, be aware that the data controller makes use of the `ANYDTDTE` and `ANYDTDTTME` informats.
-This means that uploaded date / datetime values should be unambiguous (eg `01FEB1942` vs `01/02/42`) to avoid confusion - as the latter could be interpreted as `02JAN2042` depending on your locale and options `YEARCUTOFF` settings.
+When loading dates, be aware that Data Controller makes use of the `ANYDTDTE` and `ANYDTDTTME` informats (width 19).
+This means that uploaded date / datetime values should be unambiguous (eg `01FEB1942` vs `01/02/42`) - as the latter could be interpreted as `02JAN2042` depending on your locale and `YEARCUTOFF` option settings. Note that UTC dates with offset values (eg `2018-12-26T09:19:25.123+0100`) are not currently supported. If this is a feature you would like to see, contact us.
 !!! tip
     To get a copy of a file in the right format for upload, use the [file download](/dc-userguide/#usage) feature in the Viewer tab
+## Excel Uploads
+Thanks to our pro licence of [SheetJS](https://sheetjs.com/), we can support all versions of Excel, and extract the data super quickly to boot.
+The rules for data extraction are:
+* Scan the spreadsheet until a row is found containing all the target columns (with no blank cells between them)
+* Extract data below that row, up until the first blank primary key value
+This is incredibly flexible, and means:
+* data can be anywhere, on any worksheet
+* data can contain additional columns (they are simply ignored)
+* data can be completely surrounded by other data
+A copy of the original Excel file is also uploaded to the staging area, so a complete audit trail can be captured - right back to the original source data.
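The two extraction rules added in this hunk can be sketched against a CSV stand-in. This is illustrative only - the real implementation works on the parsed worksheet, and the `id,name` target columns and first-column primary key here are assumptions made for the example.

```shell
# Illustrative only: find the row containing all target columns, then take
# the rows below it until the first blank primary key (column 1).
cat > sheet.csv <<'EOF'
junk,above,the,data
id,name
1,Alice
2,Bob
,blank-key-stops-extraction
3,Carol
EOF
awk -F, '
  found && $1 == "" { exit }                 # first blank primary key: stop
  found             { print }                # data rows below the header
  $1 == "id" && $2 == "name" { found = 1 }   # header row with all target columns
' sheet.csv
```

This prints `1,Alice` and `2,Bob`; the surrounding junk is skipped, and `3,Carol` is excluded because a blank-key row precedes it.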


@@ -4,7 +4,7 @@
 ## Overview
 Data Controller source licences are extracted automatically from our repo using the [license-checker](https://www.npmjs.com/package/license-checker) NPM module
-<code>
+```
 ├─ @angular/animations@8.2.14
 │ ├─ licenses: MIT
 │ ├─ repository: https://github.com/angular/angular
@@ -3829,4 +3829,4 @@ Data Controller source licences are extracted automatically from our repo using
 ├─ path: /Users/allan/git/dcfrontend/node_modules/zone
 └─ licenseFile: node_modules/zone/LICENSE.md
-</code>
+```