Compare commits


327 Commits
v1.1.1 ... main

Author SHA1 Message Date
5715d17312 Merge pull request 'fix: servername' (#1) from servername into main
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m21s
Reviewed-on: #1
2025-03-11 20:47:22 +00:00
a
51b4abf66e fix: note about milliseconds 2025-03-11 20:41:24 +00:00
a
907a5d6be6 fix: servername 2025-03-10 15:59:05 +00:00
a
f96c594a8c fix: double quote
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m4s
2025-03-10 09:44:55 +00:00
a
ee0441a40d chore: updates for clarity
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m14s
2025-03-07 14:35:23 +00:00
dc
cfeeb29340 fix: clarifications
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m37s
2024-09-17 09:07:37 +01:00
dc
327cb7ccfc fix: wording
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m38s
2024-09-05 11:32:00 +02:00
dc
6cfea75681 fix: updates in docs on UPDATE
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m45s
2024-09-03 21:07:44 +02:00
8718396f57 adding appinfo info
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m57s
2024-05-06 23:54:20 +01:00
aa4854df14 fix: abort reasons
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m43s
2024-05-03 13:42:41 +01:00
e172464058 requests image
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m40s
2024-05-02 23:25:10 +01:00
50185fb534 mpe_requests
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m47s
2024-05-02 22:49:08 +01:00
f488a5319c fix: stuff
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m46s
2024-05-02 16:55:50 +01:00
4f25763489 typos
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m40s
2024-05-02 16:43:57 +01:00
ddc2972bd2 restore text
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Has been cancelled
2024-05-02 16:42:46 +01:00
ca5ae17177 restore image
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m34s
2024-05-02 16:40:52 +01:00
fe169c00be fix: adding image
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m40s
2024-05-02 16:36:36 +01:00
84b8797992 restore feature
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m42s
2024-05-02 14:09:17 +01:00
f3b772b83b fix: improved sas 9 config docs
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m45s
2024-03-13 20:14:57 +00:00
zmaj
b168c95b54 updated notes for hook scripts
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m47s
2024-02-06 19:08:11 +00:00
zmaj
f5eac67aff vid
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m55s
2024-01-24 20:51:46 +00:00
zmaj
48ad6e6be9 fix: roadmap
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m50s
2024-01-24 11:51:31 +00:00
zmaj
79da26a21d fix: more links
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m52s
2024-01-24 11:18:48 +00:00
zmaj
a8580b87f7 fix: links
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m42s
2024-01-24 11:16:43 +00:00
zmaj
09d302c309 fixing links
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 1m46s
2024-01-24 10:24:58 +00:00
zmaj
1549496248 formatting
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m32s
2024-01-24 09:37:49 +00:00
zmaj
8c24252cb2 fix: desc
Some checks are pending
Publish to docs.datacontroller.io / Deploy docs (push) Waiting to run
2024-01-23 21:14:23 +00:00
zmaj
44a6bb3fbf fixings
Some checks are pending
Publish to docs.datacontroller.io / Deploy docs (push) Waiting to run
2024-01-23 18:28:02 +00:00
zmaj
109c9735a5 complex excel uploads
Some checks are pending
Publish to docs.datacontroller.io / Deploy docs (push) Waiting to run
2024-01-23 17:34:36 +00:00
zmaj
81ce833ebe fix: updated roadmap to remove part about customer-funded features
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Has been cancelled
2024-01-15 16:15:40 +00:00
197c60615d Update docs/roadmap.md
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Has been cancelled
2023-12-22 15:27:32 +00:00
zver
aabb86b6f3 fix: redeploy process
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m37s
2023-10-30 22:24:24 +00:00
zver
174f01058d fix: redeploy process
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m29s
2023-10-30 22:15:26 +00:00
zver
19cbcd8e53 fix: redeploy process
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m30s
2023-10-30 22:10:46 +00:00
zver
982bd91c3b fix: redeploy process
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m42s
2023-10-30 22:04:52 +00:00
a9747b3071 feat: improved docs for admin group
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m31s
2023-10-12 13:18:11 +01:00
5b86be18b8 chore: mentioning post edit hook
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m35s
2023-10-08 17:42:53 +01:00
70e5370eb4 chore: copyright year bump + list page in sidebar
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m33s
2023-10-06 23:56:53 +01:00
24f2829222 feat: adding mpe_security to documented tables 2023-10-06 23:48:08 +01:00
9da4a2232c Update docs/macros.md
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m24s
2023-10-01 09:39:25 +00:00
54c2a2d1a2 fix: missing 't'
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m18s
2023-08-25 14:10:55 +01:00
bc494d7e90 fix: adding more info on REPLACE loadtype
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m22s
2023-08-25 12:23:52 +01:00
6db54d6b84 Update docs/dci-troubleshooting.md
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m21s
2023-08-10 08:03:44 +00:00
63a0962b3a Update docs/dci-deploysas9.md
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m17s
2023-08-02 17:31:43 +00:00
e0ae2074d1 Update .gitea/workflows/publish.yml
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m14s
2023-07-25 07:42:41 +00:00
167d7488a7 Update docs/dcc-tables.md
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 2m4s
2023-07-25 07:35:48 +00:00
fde15da7fc Update mkdocs.yml
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 2m6s
2023-07-25 07:24:37 +00:00
Mihajlo Medjedovic
1d3ec5c56b ci: cloudron surfer
All checks were successful
Publish to docs.datacontroller.io / Deploy docs (push) Successful in 2m15s
2023-07-25 00:47:21 +02:00
Mihajlo Medjedovic
ecdd19dd1c ci: chrome install
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 1m35s
2023-07-25 00:35:42 +02:00
Mihajlo Medjedovic
e044b6c294 ci: mkdocs
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 1m7s
2023-07-25 00:32:48 +02:00
Mihajlo Medjedovic
ef11e230d5 ci: docs
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 37s
2023-07-24 19:49:32 +02:00
Mihajlo Medjedovic
95caf935b0 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 36s
2023-07-24 19:48:29 +02:00
Mihajlo Medjedovic
8a8482c8c4 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 11s
2023-07-24 19:47:59 +02:00
Mihajlo Medjedovic
769d515486 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 7s
2023-07-24 19:47:12 +02:00
Mihajlo Medjedovic
359646728a ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 19:44:25 +02:00
Mihajlo Medjedovic
8f5701c230 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 19:42:33 +02:00
Mihajlo Medjedovic
148d59a3ac ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 19:40:25 +02:00
Mihajlo Medjedovic
4bbfa92148 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 7s
2023-07-24 19:36:14 +02:00
Mihajlo Medjedovic
5daeaf5a23 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 7s
2023-07-24 19:34:27 +02:00
Mihajlo Medjedovic
bd757e9ab1 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 19:33:53 +02:00
Mihajlo Medjedovic
313eed16f0 ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 19:33:00 +02:00
Mihajlo Medjedovic
226477486b ci: python
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 28s
2023-07-24 19:30:32 +02:00
Mihajlo Medjedovic
23c9decb6e ci: pythong
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 19:27:45 +02:00
Mihajlo Medjedovic
090f3e7fee ci: python version
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 28s
2023-07-24 19:24:45 +02:00
1487563b05 chore(docs): updates on hook behaviour
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 8s
2023-07-24 10:07:15 +01:00
c6cf00a0c3 chore(docs): updated info about hook scripts
Some checks failed
Publish to docs.datacontroller.io / Deploy docs (push) Failing after 41s
2023-07-24 10:00:08 +01:00
617ff1dffb updating README.txt
Some checks reported warnings
Publish to docs.datacontroller.io / Deploy docs (push) Has been cancelled
2023-06-30 16:43:12 +01:00
e2a73d30f7 fix: first stab at gitea action 2023-06-30 16:41:08 +01:00
Allan Bowe
71835ee670
Update dcc-tables.md 2023-05-17 11:41:03 +01:00
Allan Bowe
31913f3fe6
Update dcc-options.md 2023-05-16 12:48:33 +01:00
Allan Bowe
745bcc6b26
Update dcc-tables.md 2023-04-27 12:15:12 +01:00
Allan Bowe
7a60c3cd0f
Update dcc-tables.md 2023-04-27 11:04:58 +01:00
Allan Bowe
1e5f492976
Update dcc-tables.md 2023-04-27 10:28:35 +01:00
Allan Bowe
dbcf2236db
Update dcu-fileupload.md 2023-04-19 10:50:54 +01:00
Allan Bowe
bf50577374
Update dcu-fileupload.md 2023-04-19 10:45:18 +01:00
Allan Bowe
0d113d3211
Update dcu-tableviewer.md 2023-04-18 16:36:45 +01:00
Allan Bowe
99259e62ef
Update dcu-fileupload.md 2023-04-18 09:40:08 +01:00
Allan Bowe
8ee2f4b44f
Update dci-deploysas9.md 2023-03-30 14:12:07 +01:00
Allan Bowe
87f88c7d5c
Update dci-deploysas9.md 2023-03-30 14:11:44 +01:00
Allan Bowe
e03eb1b870
Update dci-deploysas9.md 2023-03-30 13:26:28 +01:00
Allan Bowe
667b31d222
Update macros.md 2023-03-26 23:08:16 +01:00
Allan Bowe
cb2b4c050e
Update viewboxes.md 2023-02-23 21:48:53 +00:00
Allan Bowe
1791f89ff1
Update dci-deploysas9.md 2023-02-16 14:03:12 +00:00
munja
d434d24fd0 chore: automated commit 2023-02-06 23:30:49 +00:00
munja
1c94506a81 chore: automated commit 2023-02-06 23:27:59 +00:00
munja
a0bc96b726 chore: automated commit 2023-02-06 23:22:34 +00:00
munja
de3d16dce5 chore: automated commit 2023-02-06 21:51:23 +00:00
munja
f97c6dd86a chore: automated commit 2023-02-06 21:50:36 +00:00
munja
a2b4abff2b chore: automated commit 2023-02-06 21:30:00 +00:00
munja
6e4fb12ecd chore: automated commit 2023-02-06 21:29:27 +00:00
munja
293c2d75ad chore: automated commit 2023-02-06 21:20:11 +00:00
Allan Bowe
631cbb19a1
Update dcc-options.md 2023-02-02 00:15:48 +00:00
Allan Bowe
5944c74e0f
Update dcc-options.md 2023-02-02 00:15:11 +00:00
munja
cb336a9794 chore: automated commit 2023-01-24 15:39:49 +00:00
munja
ee96a2561e feat: viewboxes 2023-01-23 14:19:16 +00:00
munja
aa513f7b12 chore: automated commit 2022-12-15 11:27:33 +01:00
munja
5a4f62e2de chore: automated commit 2022-12-14 23:22:48 +01:00
munja
affe5f40e9 chore: automated commit 2022-12-14 20:38:39 +01:00
munja
a000ae01cf chore: automated commit 2022-12-14 20:29:59 +01:00
munja
d97b1bcc03 fix: locking mechanism 2022-12-14 16:30:08 +01:00
Allan Bowe
2b5d664997
Update dci-troubleshooting.md 2022-11-18 08:50:03 +00:00
Allan Bowe
4aacea1c98
Update evaluation-agreement.md 2022-11-01 12:29:17 +00:00
Allan Bowe
9c29f3d44c fix: bowe io ltd 2022-11-01 12:27:19 +00:00
munja
bef2c0dc65 chore: automated commit 2022-10-15 00:40:35 +01:00
munja
89cf15ba55 chore: automated commit 2022-10-14 22:53:10 +01:00
munja
abf2e912b5 fix: moving pdf to theme folder 2022-10-14 22:45:33 +01:00
munja
dbd45185ce Merge branch 'master' of github.com:datacontroller/dcdocs.github.io 2022-10-14 22:40:40 +01:00
munja
17ee8cfa9f feat: if 2022-10-14 22:40:27 +01:00
Allan Bowe
72fdcaf1c7
Update dcc-validations.md 2022-10-05 14:34:31 +01:00
Allan Bowe
3249a2851c
Update formats.md 2022-08-15 11:09:43 +01:00
munja
eb6abe4719 chore: automated commit 2022-08-15 11:08:54 +01:00
Allan Bowe
c43910561d
Update formats.md 2022-08-15 10:58:00 +01:00
Allan Bowe
cac1eb8733
Update formats.md 2022-08-15 10:57:33 +01:00
Allan Bowe
320a979c4f
Update libraries.md 2022-07-25 16:25:10 +01:00
Allan Bowe
07d0c6d4dc
Update libraries.md 2022-07-25 15:03:56 +01:00
Allan Bowe
07419c4b1f
Update libraries.md 2022-07-25 15:03:24 +01:00
Allan Bowe
de15000cc2
Update dcc-security.md 2022-07-25 14:55:07 +01:00
munja
653354130e Merge branch 'master' of github.com:datacontroller/dcdocs.github.io 2022-07-25 14:49:16 +01:00
munja
b094b00e76 chore: automated commit 2022-07-25 14:49:08 +01:00
Allan Bowe
20edeb7919
Update roadmap.md 2022-07-20 19:56:46 +01:00
Allan Bowe
feb531b8c2
Update index.md 2022-07-20 19:55:31 +01:00
munja
7ec38c3a12 chore: automated commit 2022-07-15 14:26:54 +01:00
Allan Bowe
28bf4061d3
Update row-level-security.md 2022-07-11 19:30:57 +01:00
Allan Bowe
ef16696bb9
Update column-level-security.md 2022-07-11 19:04:28 +01:00
Allan Bowe
68fcfdfd3e
Update row-level-security.md 2022-07-11 19:03:58 +01:00
munja
885798691d chore: automated commit 2022-07-11 18:57:12 +01:00
munja
b01480de42 chore: automated commit 2022-07-11 16:33:23 +01:00
munja
f6bf0bb04a chore: automated commit 2022-07-11 16:18:32 +01:00
Allan Bowe
861b8900c4
Update dc-userguide.md 2022-07-11 13:53:13 +01:00
Allan Bowe
13b3d30a36
Update roadmap.md 2022-07-10 16:30:47 +01:00
Allan Bowe
6da8d7c7ef
Update dcc-groups.md 2022-07-09 23:50:25 +01:00
munja
5e63360b35 chore: automated commit 2022-07-09 23:48:49 +01:00
munja
0acbd660f5 chore: automated commit 2022-07-09 23:35:24 +01:00
Allan Bowe
60a51c94ba
Update column-level-security.md 2022-07-02 18:37:10 +01:00
Allan Bowe
9afa6f0683
Update column-level-security.md 2022-07-02 17:47:11 +01:00
Allan Bowe
7e78c66b11 fix: adding mpe config to tables in sidebar 2022-06-27 22:01:51 +00:00
Allan Bowe
3d211004e3 fix: docs for mpe_config 2022-06-27 21:58:22 +00:00
Allan Bowe
f4fe819033
Update admin-services.md 2022-06-21 16:17:59 +01:00
Allan Bowe
71e132d4a2
Update admin-services.md 2022-06-21 16:10:25 +01:00
Allan Bowe
f4d1d3f7ee
Update mpe_audit.md 2022-06-21 15:10:20 +01:00
Allan Bowe
c2472c5de7 fix: preview for mpe_audit 2022-06-21 11:46:36 +00:00
Allan Bowe
8d8eb61e95 fix: mpe_audit 2022-06-21 11:40:05 +00:00
Allan Bowe
6ffb7206d5 fix: moving mpe_tables 2022-06-18 16:55:54 +00:00
Allan Bowe
f88ac5ede2 fix: sidebar 2022-06-18 16:53:08 +00:00
Allan Bowe
3f79b89a47 feat: mpe_review 2022-06-18 16:50:50 +00:00
Allan Bowe
b743f856fc feat: mpe_submit docs 2022-06-17 18:42:40 +00:00
munja
d8e1c514bc fix: image 2022-06-17 10:28:01 +02:00
Allan Bowe
5a34f251d7
Update dynamic-cell-dropdown.md 2022-06-14 14:23:57 +01:00
Allan Bowe
e73afc7f72
Update dynamic-cell-dropdown.md 2022-06-14 14:22:48 +01:00
munja
01a9f1fca0 fix: doc updates for hook scripts 2022-06-10 19:50:08 +02:00
Allan Bowe
303679f48c
Update column-level-security.md 2022-06-09 21:07:10 +01:00
Allan Bowe
9cb813f75c
Update dcu-fileupload.md 2022-06-07 12:17:19 +01:00
Allan Bowe
8877be0f19
Update dcc-tables.md 2022-06-07 09:22:52 +01:00
Allan Bowe
87b5a425b4
Update dcc-tables.md 2022-05-23 15:08:31 +01:00
Allan Bowe
2a31e3fe2d
Update dcc-tables.md 2022-05-23 14:50:35 +01:00
Allan Bowe
98b776a8d1
Update column-level-security.md 2022-05-20 16:26:02 +01:00
munja
886bea7d66 chore: automated commit 2022-05-18 15:34:15 +01:00
munja
f24c792eb6 chore: automated commit 2022-05-18 15:33:33 +01:00
munja
18f9e8c01a fix: hooks"
\
2022-05-18 15:17:33 +01:00
munja
cb39441111 fix: hook script info 2022-05-18 14:42:46 +01:00
munja
2207d1b027 feat: column level security 2022-05-18 13:03:14 +01:00
Allan Bowe
fda029c265
Update index.md 2022-05-09 12:20:53 +01:00
munja
fe7f9a7452 chore: automated commit 2022-04-30 19:30:18 +01:00
munja
44eca4cf1e chore: automated commit 2022-04-30 19:10:20 +01:00
munja
360dd654c8 chore: automated commit 2022-04-30 18:44:41 +01:00
munja
a174a69cb8 chore: automated commit 2022-04-30 18:42:32 +01:00
munja
d2f483df03 chore: automated commit 2022-04-30 18:38:40 +01:00
munja
7d034f2bea chore: automated commit 2022-04-30 18:37:05 +01:00
munja
971f756d9e chore: automated commit 2022-04-30 18:36:00 +01:00
munja
d4537dfe13 chore: automated commit 2022-04-30 18:33:29 +01:00
munja
7caa73e350 chore: automated commit 2022-04-30 18:18:08 +01:00
Allan Bowe
aba94a06c7
Update roadmap.md 2022-04-22 14:15:04 +01:00
Allan Bowe
b993fe9e45 feat: mpe_audit table 2022-04-21 12:34:29 +00:00
Allan Bowe
a581088618
Update dcu-fileupload.md 2022-04-21 08:57:28 +01:00
munja
12f0609089 Merge branch 'master' of github.com:datacontroller/dcdocs.github.io 2022-03-23 21:52:45 +00:00
munja
83a80dc2c6 fix: images 2022-03-23 21:52:38 +00:00
Allan Bowe
11763f1b0a
Update dcc-groups.md 2022-03-12 15:37:33 +00:00
munja
236883293c chore: automated commit 2022-03-12 14:41:33 +00:00
munja
c53884dc32 fix: switching Macro People for 4GL 2022-03-12 14:27:39 +00:00
munja
1616b3d933 Merge branch 'master' of github.com:datacontroller/dcdocs.github.io 2022-03-07 20:54:35 +00:00
munja
65a52f6e8f chore: automated commit 2022-03-07 20:54:23 +00:00
Allan Bowe
551acb85d7
Update dynamic-cell-dropdown.md 2022-03-05 11:09:06 +00:00
munja
ab4de7010d fix: format image 2022-02-27 21:59:51 +00:00
munja
bf146c2375 fix: update to include formats and special SAS missing numerics in the docs 2022-02-27 19:37:35 +00:00
Allan Bowe
7acb205d81
Update dcu-fileupload.md 2022-02-24 11:36:57 +00:00
munja
82b2a752b3 chore: automated commit 2022-02-02 23:54:38 +01:00
munja
1f450bc0ef chore: automated commit 2022-02-02 18:23:51 +01:00
munja
f3da34a50d feat: formats plan 2022-01-27 20:59:42 +01:00
munja
f261a716be chore: automated commit 2022-01-17 22:58:32 +01:00
munja
b56cf28a46 feat: api features 2022-01-17 22:13:36 +01:00
Allan Bowe
bc2a7e2a3c
Update dcc-validations.md 2022-01-17 10:51:59 +00:00
Allan Bowe
fea3a0055a chore: automated commit 2021-11-11 14:19:04 +00:00
Allan Bowe
f07e8d92d1 chore: automated commit 2021-11-11 14:12:06 +00:00
Allan Bowe
eeb8f360a2 chore: automated commit 2021-11-11 14:00:25 +00:00
Allan Bowe
6218e442b2 chore: automated commit 2021-10-28 10:47:20 +01:00
Allan Bowe
c066a1ab00 chore: automated commit 2021-10-28 10:31:32 +01:00
Allan Bowe
06cabc8d29
Update roadmap.md 2021-10-25 15:25:44 +03:00
Allan Bowe
61c23722cc chore: automated commit 2021-10-21 10:55:56 +01:00
Allan Bowe
35ad0f288f chore: automated commit 2021-10-21 10:54:08 +01:00
Allan Bowe
e138d11e33 chore: automated commit 2021-10-20 17:22:07 +01:00
Allan Bowe
ac62ced4c4 chore: automated commit 2021-10-20 15:27:29 +01:00
Allan Bowe
a3839fe2fb
Merge pull request #8 from medjedovicm/patch-2
Update roadmap.md
2021-10-20 14:58:23 +01:00
Mihajlo Medjedovic
fbb6177ab1
Update roadmap.md
Added estimates
2021-10-20 15:48:08 +02:00
Allan Bowe
6fc61e603f chore: automated commit 2021-10-20 10:47:22 +01:00
Allan Bowe
146b85d3e0 chore: automated commit 2021-10-19 17:12:16 +01:00
Allan Bowe
2334103a8d chore: automated commit 2021-10-19 16:51:21 +01:00
Allan Bowe
7e619ca575 chore: automated commit 2021-10-19 16:47:33 +01:00
Allan Bowe
d0ae861e87 chore: automated commit 2021-10-09 16:47:17 +01:00
Allan Bowe
dcf2b11f24 chore: automated commit 2021-10-09 16:45:16 +01:00
Allan Bowe
f5d644cdeb chore: automated commit 2021-10-09 16:43:26 +01:00
Allan Bowe
8f8a2d9dc0 chore: automated commit 2021-10-09 16:42:31 +01:00
Allan Bowe
bcd0e913f6 chore: automated commit 2021-10-09 16:39:29 +01:00
Allan Bowe
e298d9d8b9
Update dcu-fileupload.md 2021-10-07 18:43:22 +03:00
Allan Bowe
418c76cb04 chore: automated commit 2021-09-30 14:01:15 +01:00
Allan Bowe
9f8474695e chore: automated commit 2021-09-09 13:34:59 +03:00
Allan Bowe
cf4d117286 chore: automated commit 2021-09-07 14:51:19 +03:00
Allan Bowe
7e9e2cf8f3 chore: automated commit 2021-09-07 14:19:30 +03:00
Allan Bowe
97d709a061 chore: automated commit 2021-09-06 21:29:03 +03:00
Allan Bowe
653bb8be50 chore: automated commit 2021-09-06 14:57:25 +03:00
Allan Bowe
7139f37368 chore: automated commit 2021-09-06 14:55:18 +03:00
Allan Bowe
d90844a43d chore: automated commit 2021-09-06 14:44:44 +03:00
Allan Bowe
2c1ca60439 chore: automated commit 2021-09-06 13:46:50 +03:00
Allan Bowe
9d407e809d chore: automated commit 2021-09-06 13:43:36 +03:00
Allan Bowe
82deeaa2e9 chore: automated commit 2021-09-06 12:00:16 +03:00
Allan Bowe
75f336587e chore: automated commit 2021-09-06 11:54:31 +03:00
Allan Bowe
7b6b2a8d3c chore: automated commit 2021-08-24 19:01:14 +03:00
Allan Bowe
76e16d49e0 chore: automated commit 2021-08-24 00:02:24 +03:00
Allan Bowe
82ba4da9cb chore: automated commit 2021-08-23 23:56:28 +03:00
Allan Bowe
dbd4c6aead
Update videos.md 2021-08-07 22:03:08 +03:00
Allan Bowe
3da9e73ede chore: automated commit 2021-07-26 10:22:13 +03:00
Allan Bowe
8108b9ed57 chore: automated commit 2021-07-25 15:40:13 +03:00
Allan Bowe
80ddd83003 chore: automated commit 2021-07-25 15:20:45 +03:00
Allan Bowe
cf4a47430f
Update dci-deploysasviya.md 2021-07-23 13:22:47 +03:00
Allan Bowe
84f825c444
Update row-level-security.md 2021-07-06 00:56:29 +03:00
Allan Bowe
b8d04772b4 chore: automated commit 2021-07-04 15:33:21 +03:00
Allan Bowe
2c45c99497
Update index.md 2021-07-04 15:21:11 +03:00
Allan Bowe
f5da5148da chore: automated commit 2021-06-11 12:20:09 +03:00
Allan Bowe
01dac2ffef chore: automated commit 2021-06-11 12:11:15 +03:00
Allan Bowe
0a02f44892 chore: automated commit 2021-05-25 23:53:31 +03:00
Allan Bowe
e15dac7e25 chore: automated commit 2021-05-25 23:48:28 +03:00
Allan Bowe
20582b8fe5 chore: automated commit 2021-05-25 23:44:41 +03:00
Allan Bowe
6a82a052cf chore: automated commit 2021-05-25 23:43:57 +03:00
Allan Bowe
32b169bb01 chore: automated commit 2021-05-25 23:41:12 +03:00
Allan Bowe
d3d2806dd9 chore: automated commit 2021-05-25 23:37:41 +03:00
Allan Bowe
dc42c418d4 chore: automated commit 2021-05-25 23:33:12 +03:00
Allan Bowe
ae4ee9dd10 chore: automated commit 2021-05-25 23:32:42 +03:00
Allan Bowe
d51f46247c chore: automated commit 2021-05-22 15:37:21 +03:00
Allan Bowe
09cff61cc5 chore: automated commit 2021-05-13 21:09:38 +03:00
Allan Bowe
77095c788c chore: automated commit 2021-05-13 14:33:06 +03:00
Allan Bowe
3e96e9b5d7 chore: automated commit 2021-05-13 11:36:39 +03:00
Allan Bowe
67aa898a31 chore: automated commit 2021-05-13 10:26:50 +03:00
Allan Bowe
31927aa62e chore: automated commit 2021-05-12 23:22:35 +03:00
Allan Bowe
88f10cb680 chore: automated commit 2021-05-12 22:29:24 +03:00
Allan Bowe
fc4b5d839c chore: automated commit 2021-05-12 21:42:21 +03:00
Allan Bowe
f8563d270b chore: automated commit 2021-05-12 17:53:38 +03:00
Allan Bowe
19fd6089ce chore: automated commit 2021-05-12 16:34:35 +03:00
Allan Bowe
7ba9c78980 Merge branch 'master' of github.com:macropeople/dcdocs.github.io 2021-05-12 00:36:40 +03:00
Allan Bowe
2dff6c0295 chore: automated commit 2021-05-12 00:36:34 +03:00
allanbowe
8b14d4526b chore: automated commit 2021-05-01 10:56:25 +02:00
Beast
7b225cf29b chore: automated commit 2021-04-30 21:27:38 +03:00
allanbowe
83ec14e3b7 chore: automated commit 2021-04-30 15:18:41 +02:00
Beast
520b8d3585 feat: innovation 2021-04-30 15:49:04 +03:00
allanbowe
89f15d0be2 chore: automated commit 2021-04-30 12:15:33 +02:00
allanbowe
b466160367 chore: automated commit 2021-04-28 20:31:31 +02:00
allanbowe
02130cd138 chore: automated commit 2021-04-26 21:32:13 +02:00
allanbowe
4e5e4b917e feat: new troubleshooting section 2021-04-26 21:30:12 +02:00
Allan Bowe
fbbf3ebaf0 chore: automated commit 2021-03-21 19:49:27 +01:00
Allan Bowe
83f9b1768c chore: automated commit 2021-03-21 16:09:06 +01:00
Allan Bowe
9a8a79f9c3 chore: automated commit 2021-03-20 14:56:16 +01:00
Allan Bowe
c793589f90 chore: automated commit 2021-03-14 19:43:06 +01:00
Allan Bowe
9ebaf030af chore: automated commit 2021-03-14 18:46:22 +01:00
Allan Bowe
86533ae733 chore: automated commit 2021-03-13 11:37:30 +01:00
Allan Bowe
e12fc140f5 chore: automated commit 2021-03-12 23:21:55 +01:00
Allan Bowe
1dbabf54a1 chore: automated commit 2021-03-12 22:33:33 +01:00
Allan Bowe
8fc08c56b4 chore: automated commit 2021-03-11 23:56:50 +01:00
Allan Bowe
1053f3df76 chore: automated commit 2021-03-11 23:56:00 +01:00
Allan Bowe
0027189b74 chore: automated commit 2021-03-11 23:31:07 +01:00
Allan Bowe
cd17e6a970 chore: automated commit 2021-03-11 22:03:58 +01:00
Allan Bowe
bc5e579465 chore: automated commit 2021-03-11 21:56:31 +01:00
Allan Bowe
0dbf891fb9
Update macros.md 2021-02-22 09:01:07 +01:00
Allan Bowe
02f5311749 another commit 2021-02-21 16:35:10 +01:00
Allan Bowe
1077f239b9 another commit 2021-02-21 16:34:15 +01:00
Allan Bowe
ef95bc0a9d another commit 2021-02-17 18:42:36 +01:00
Allan Bowe
0d825b4572 another commit 2021-02-17 16:58:25 +01:00
Yury Shkoda
904f4004f5
Merge pull request #6 from medjedovicm/patch-1
Update dci-deploysasviya.md
2021-02-17 16:44:44 +03:00
Yury Shkoda
5ecf290a32
chore: improved the grammar 2021-02-17 16:44:05 +03:00
Mihajlo Medjedovic
13056fefce
Update dci-deploysasviya.md 2021-02-17 13:05:22 +01:00
Mihajlo Medjedovic
b9e180b84c
Update dci-deploysasviya.md 2021-02-17 12:57:30 +01:00
Allan Bowe
0f1bf15d14 another commit 2021-02-14 19:55:17 +01:00
Allan Bowe
b41d859112 another commit 2021-02-14 19:53:33 +01:00
Allan Bowe
171fcdab05 feat: viya deploy@ 2021-01-05 18:48:13 +00:00
Allan Bowe
6cc7186f99 fix: doc update 2020-12-20 10:59:31 +00:00
vrh
454ea7351a fix: starting on dci viya deploy 2020-10-02 22:35:55 +02:00
vrh
5116d1a45a docs: viya 2020-10-01 19:22:31 +02:00
vrh
ca88c9e9b0 fix: adding sas 9 deploy info 2020-09-26 16:51:37 +02:00
vrh
242956bed6 adding vids 2020-08-30 19:29:36 +02:00
vrh
a0f62fd0ba index update 2020-08-27 18:05:41 +02:00
Allan Bowe
15dea29c23 Merge branch 'master' of github.com:macropeople/dcdocs.github.io 2020-08-20 09:41:43 +02:00
Allan Bowe
46eac2ba1c flyers 2020-08-20 09:39:17 +02:00
vrh
26779f31b0 video updates 2020-08-19 12:41:35 +02:00
Allan Bowe
404b35616f fix: small correction 2020-08-06 13:13:02 +02:00
vrh
ae3d9645ed removing package lock 2020-08-05 20:14:08 +02:00
vrh
2cbf0b557e fix: updates for SEO and new videos page 2020-08-05 20:13:19 +02:00
Allan Bowe
1d988416ee fixes 2020-07-25 00:53:48 +02:00
Allan Bowe
23ab2580f2 docs: deploy info 2020-07-15 12:33:21 +02:00
Allan Bowe
904f6ee81b jobmeta update 2020-05-31 18:05:51 +02:00
Allan Bowe
360426733a adding dcu_flow image and updating stp instance docs 2020-05-26 09:43:42 +02:00
Allan Bowe
080b9115fa doc updates 2020-05-13 10:25:53 +02:00
Allan Bowe
2987333ce9 feat: datacatalog info 2020-05-07 00:16:01 +02:00
Allan Bowe
53f071a0b7 docs: viewer 2020-04-24 22:31:16 +02:00
Allan Bowe
198d11a9f2 excel updates 2020-04-16 22:06:52 +02:00
Allan Bowe
d074aee048 docs: licences update 2020-04-16 10:44:57 +02:00
Allan Bowe
edb5b46655
Merge pull request #3 from macropeople/dependabot/npm_and_yarn/lodash.template-4.5.0
Bump lodash.template from 4.4.0 to 4.5.0
2020-04-13 21:07:13 +02:00
Allan Bowe
9c5eab7b7a
Merge pull request #2 from macropeople/dependabot/npm_and_yarn/handlebars-4.7.3
Bump handlebars from 4.0.12 to 4.7.3
2020-04-13 21:06:52 +02:00
Allan Bowe
621eac84b2
Merge pull request #1 from macropeople/dependabot/npm_and_yarn/lodash-4.17.15
Bump lodash from 4.17.11 to 4.17.15
2020-04-13 21:06:36 +02:00
Allan Bowe
0ab3584bc1 Privacy Policy 2020-04-13 21:01:06 +02:00
Allan Bowe
68004c5eea update doc 2020-03-30 12:56:09 +02:00
Allan Bowe
c7bc1649df new dc image 2020-03-30 12:53:02 +02:00
Allan Bowe
31598b6e8b fix: validations documentation 2020-03-26 19:59:11 +01:00
Allan Bowe
85de43ac5b feat: validations 2020-03-26 19:19:33 +01:00
Allan Bowe
e4da4e2088 docs: flyer 2020-03-13 22:50:42 +01:00
Allan Bowe
0208738dc3 adding options page 2020-03-10 21:39:00 +01:00
dependabot[bot]
eee3a54cef
Bump lodash.template from 4.4.0 to 4.5.0
Bumps [lodash.template](https://github.com/lodash/lodash) from 4.4.0 to 4.5.0.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.4.0...4.5.0)

Signed-off-by: dependabot[bot] <support@github.com>
2020-02-09 16:44:39 +00:00
dependabot[bot]
08d26af4ec Bump handlebars from 4.0.12 to 4.7.3
Bumps [handlebars](https://github.com/wycats/handlebars.js) from 4.0.12 to 4.7.3.
- [Release notes](https://github.com/wycats/handlebars.js/releases)
- [Changelog](https://github.com/wycats/handlebars.js/blob/master/release-notes.md)
- [Commits](https://github.com/wycats/handlebars.js/compare/v4.0.12...v4.7.3)

Signed-off-by: dependabot[bot] <support@github.com>
2020-02-09 16:44:38 +00:00
dependabot[bot]
7ae91cb0c8
Bump lodash from 4.17.11 to 4.17.15
Bumps [lodash](https://github.com/lodash/lodash) from 4.17.11 to 4.17.15.
- [Release notes](https://github.com/lodash/lodash/releases)
- [Commits](https://github.com/lodash/lodash/compare/4.17.11...4.17.15)

Signed-off-by: dependabot[bot] <support@github.com>
2020-02-09 16:44:38 +00:00
Allan Bowe
f7c3dfe38e fix: links 2020-02-05 18:58:12 +01:00
Allan Bowe
dc3458b05c feat: new VIEW security 2020-02-05 18:54:55 +01:00
198 changed files with 43763 additions and 9455 deletions

@@ -0,0 +1,53 @@
name: Publish to docs.datacontroller.io
on:
  push:
    branches:
      - main
jobs:
  build:
    name: Deploy docs
    runs-on: ubuntu-latest
    steps:
      - uses: actions/setup-node@v3
        with:
          node-version: 18
      - name: Checkout master
        uses: actions/checkout@v2
      - name: Setup Python
        uses: actions/setup-python@v4
        env:
          AGENT_TOOLSDIRECTORY: /opt/hostedtoolcache
          RUNNER_TOOL_CACHE: /opt/hostedtoolcache
      - name: Install pip3
        run: |
          apt-get update
          apt-get install python3-pip -y
      - name: Install Chrome
        run: |
          apt-get update
          wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
          apt install -y ./google-chrome*.deb;
          export CHROME_BIN=/usr/bin/google-chrome
      - name: Install Surfer
        run: |
          npm -g install cloudron-surfer
      - name: build site
        run: |
          pip3 install mkdocs
          pip3 install mkdocs-material
          pip3 install fontawesome_markdown
          pip3 install mkdocs-redirects
          python3 -m mkdocs build --clean
          mkdir site/slides
          npx @marp-team/marp-cli slides/innovation/innovation.md -o ./site/slides/innovation/index.html
          npx @marp-team/marp-cli slides/if/if.md -o site/if.pdf --allow-local-files --html=true
      - name: Deploy docs
        run: surfer put --token ${{ secrets.SURFERKEY }} --server docs.datacontroller.io site/* /
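The "build site" step above installs each MkDocs package unpinned on every run, so a new upstream release can silently change or break the build. A requirements file makes the step reproducible — a minimal sketch, listing the packages taken verbatim from the workflow; the version pins are illustrative assumptions, not taken from this repo:

```text
# requirements.txt — packages used by the "build site" step above.
# Pinned versions are illustrative assumptions; check the versions
# your site actually builds with before adopting them.
mkdocs==1.5.3
mkdocs-material==9.5.17
fontawesome_markdown==0.2.6
mkdocs-redirects==1.2.1
```

The workflow step would then reduce to `pip3 install -r requirements.txt`, and dependency bumps become visible as diffs to this file.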

.gitignore vendored

@ -1,3 +1,4 @@
site/
*.swp
node_modules/
**/.DS_Store

.gitpod.yml Normal file

@ -0,0 +1,8 @@
# This configuration file was automatically generated by Gitpod.
# Please adjust to your needs (see https://www.gitpod.io/docs/config-gitpod-file)
# and commit this file to your remote git repository to share the goodness with others.
tasks:
- init: npm install

.vscode/extensions.json vendored Normal file

@ -0,0 +1,5 @@
{
"recommendations": [
"marp-team.marp-vscode"
]
}


@ -1,26 +0,0 @@
# Change Log
All notable changes to this project will be documented in this file. See [standard-version](https://github.com/conventional-changelog/standard-version) for commit guidelines.
<a name="1.1.1"></a>
## [1.1.1](https://github.com/macropeople/dcdocs.github.io/compare/v1.1.0...v1.1.1) (2020-01-25)
<a name="1.1.0"></a>
# [1.1.0](https://github.com/macropeople/dcdocs.github.io/compare/v1.0.0...v1.1.0) (2019-01-26)
### Features
* **git:** adding standard-version ([9433758](https://github.com/macropeople/dcdocs.github.io/commit/9433758))
<a name="1.0.0"></a>
# 1.0.0 (2019-01-26)
### Features
* **build:** new build script" ([bf8eda7](https://github.com/macropeople/dcdocs.github.io/commit/bf8eda7))


@ -1,15 +1,3 @@
Deploy steps in build.sh
1. brew install mkdocs
2. pip install mkdocs-material
3. pip install fontawesome_markdown
To build, navigate to the root of this repo and run:
mkdocs build --clean
mkdocs serve
To deploy, run the following:
`rsync -avz --exclude .git/ --del -e "ssh -p 722" ~/git/dcdocs/site/ macropeo@77.72.0.226:/home/macropeo/docs.datacontroller.io`
Note that the readme has been renamed to README.txt to prevent github pages from considering it to be the index by default!


@ -4,9 +4,35 @@
####################################################################
## Create regular mkdocs docs
echo 'extracting licences'
OUTFILE='docs/licences.md'
cat > $OUTFILE <<'EOL'
<!-- this page is AUTOMATICALLY updated!! -->
# Data Controller for SAS® - Source Licences
## Overview
Data Controller source licences are extracted automatically from our repo using the [license-checker](https://www.npmjs.com/package/license-checker) NPM module
```
EOL
license-checker --production --relativeLicensePath --direct --start ../dcfrontend >> docs/licences.md
echo '```' >> docs/licences.md
echo 'building mkdocs'
pip3 install mkdocs
pip3 install mkdocs-material
pip3 install fontawesome_markdown
python3 -m mkdocs build --clean
#mkdocs serve
# update slides
mkdir site/slides
npx @marp-team/marp-cli slides/innovation/innovation.md -o ./site/slides/innovation/index.html
rsync -avz --exclude .git/ --del -e "ssh -p 722" site/ macropeo@77.72.0.226:/home/macropeo/docs.datacontroller.io

docs/admin-services.md Normal file

@ -0,0 +1,98 @@
---
layout: article
title: Admin Services
description: Data Controller contains a number of admin-only web services, such as DB Export, Lineage Generation, and Data Catalog refresh.
---
# Admin Services
Several web services have been defined to provide additional functionality outside of the user interface. These somewhat-hidden services must be called directly, using a web browser.
In a future version, these features will be made available from an Admin screen (so, no need to manually modify URLs).
The URL is made up of several components:
* `SERVERURL` -> the domain (and port) on which your SAS server resides
* `EXECUTOR` -> Either `SASStoredProcess` for SAS 9, else `SASJobExecution` for Viya
* `APPLOC` -> The root folder location in which the Data Controller backend services were deployed
* `SERVICE` -> The actual Data Controller service being described. May include additional parameters.
To illustrate the above, consider the following URL:
[https://viya.4gl.io/SASJobExecution/?_program=/Public/app/viya/services/admin/exportdb&flavour=PGSQL](https://viya.4gl.io/SASJobExecution/?_program=/Public/app/viya/services/admin/exportdb&flavour=PGSQL)
This is broken down into:
* `$SERVERURL` = `https://viya.4gl.io`
* `$EXECUTOR` = `SASJobExecution`
* `$APPLOC` = `/Public/app/viya`
* `$SERVICE` = `services/admin/exportdb&flavour=PGSQL`
The sections below describe only the `$SERVICE` component - you may construct the full URL as follows:
* `$SERVERURL/$EXECUTOR?_program=$APPLOC/$SERVICE`
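As a quick sanity check, the pattern above can be assembled with a few macro variables. This is a minimal sketch only, using the example values from this page:

```sas
/* Sketch: assemble an admin service URL from its components */
%let serverurl=https://viya.4gl.io;  /* domain (and port) of the SAS server  */
%let executor=SASJobExecution;       /* use SASStoredProcess for SAS 9       */
%let apploc=/Public/app/viya;        /* backend deployment folder            */
/* %nrstr masks the ampersand so it is not treated as a macro trigger */
%let service=services/admin/exportdb%nrstr(&)flavour=PGSQL;

/* writes the finished URL to the log */
%put &serverurl/&executor.?_program=&apploc/&service;
```

The resulting string can then be pasted directly into a browser address bar.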
## Export Config
This service will provide a zip file containing the current database configuration. This is useful for migrating to a different data controller database instance.
EXAMPLE:
* `services/admin/exportconfig`
## Export Database
Exports the data controller control library in DB specific DDL. The following URL parameters may be added:
* `&flavour=` (only PGSQL supported at this time)
* `&schema=` (optional, if target schema is needed)
EXAMPLES:
* `services/admin/exportdb&flavour=PGSQL&schema=DC`
* `services/admin/exportdb&flavour=PGSQL`
## Refresh Data Catalog
In any SAS estate, it's unlikely the size & shape of data will remain static. By running a regular Catalog Scan, you can track changes such as:
- Library Properties (size, schema, path, number of tables)
- Table Properties (size, number of columns, primary keys)
- Variable Properties (presence in a primary key, constraints, position in the dataset)
The data is stored with SCD2 so you can actually **track changes to your model over time**! Curious when that new column appeared? Just check the history in [MPE_DATACATALOG_TABS](/tables/mpe_datacatalog_tabs).
To run the refresh process, just trigger the stored process, eg below:
* `services/admin/refreshcatalog`
* `services/admin/refreshcatalog&libref=MYLIB`
The optional `&libref=` parameter allows you to run the process for a single library. Just provide the libref.
When doing a full scan, the following LIBREFS are ignored:
* `CASUSER`
* `MAPSGFK`
* `SASUSER`
* `SASWORK`
* `STPSAMP`
* `TEMP`
* `WORK`
Additional LIBREFs can be excluded by adding them to the `DCXXXX.MPE_CONFIG` table (where `var_scope='DC_CATALOG' and var_name='DC_IGNORELIBS'`). Use a pipe (`|`) symbol to separate them. This can be useful where there are connection issues for a particular library.
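Assuming a control libref of `DATACTRL` (substitute your own `DCXXXX` value), such an exclusion might be staged as follows. This is a sketch only - the `var_value` column name is an assumption, so verify it against your own MPE_CONFIG structure, and note that in practice configuration changes are normally submitted through the Data Controller editor itself so they are reviewed and audited:

```sas
/* Sketch: exclude two librefs from the catalog scan.
   Assumes control libref DATACTRL and a var_value column. */
proc sql;
insert into datactrl.mpe_config
  set var_scope='DC_CATALOG'
    , var_name='DC_IGNORELIBS'
    , var_value='BADLIB|SLOWLIB';  /* pipe-separated librefs */
quit;
```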
Be aware that the scan process can take a long time if you have a lot of tables!
Output tables (all SCD2):
* [MPE_DATACATALOG_LIBS](/tables/mpe_datacatalog_libs) - Library attributes
* [MPE_DATACATALOG_TABS](/tables/mpe_datacatalog_tabs) - Table attributes
* [MPE_DATACATALOG_VARS](/tables/mpe_datacatalog_vars) - Column attributes
* [MPE_DATASTATUS_LIBS](/tables/mpe_datastatus_libs) - Frequently changing library attributes (such as size & number of tables)
* [MPE_DATASTATUS_TABS](/tables/mpe_datastatus_tabs) - Frequently changing table attributes (such as size & number of rows)
## Update Licence Key
Whenever navigating Data Controller, there is always a hash (`#`) in the URL. To access the licence key screen, remove all content to the RIGHT of the hash and add the following string: `/licensing/update`.
If you are using the https protocol, you will have two keys (licence key / activation key). In http mode, there is just one key (licence key) for both boxes.

docs/api.md Normal file

@ -0,0 +1,102 @@
---
layout: article
title: API
description: The Data Controller API provides a machine-programmable interface for loading spreadsheets into SAS
---
!!! warning
Work in Progress!
# API
Where a project has a requirement to load Excel Files automatically into SAS, from a remote machine, an API approach is desirable for many reasons:
* Security. Client access can be limited to just the endpoints they need (rather than being granted full server access).
* Flexibility. Well documented, stable APIs allow consumers to build and extend additional products and solutions.
* Cost. API solutions are typically self-contained, quick to implement, and easy to learn.
A Data Controller API would enable teams across an entire enterprise to easily and securely send data to SAS in a transparent and fully automated fashion.
The API would also benefit from all of Data Controller's existing [data validation](https://docs.datacontroller.io/dcc-validations/) logic (both frontend and backend), data auditing, [alerts](https://docs.datacontroller.io/emails/), and [security control](https://docs.datacontroller.io/dcc-security/) features.
It is, however, a significant departure from the existing "SAS Content" based deployment, in the following ways:
1. Server Driven. A machine is required on which to launch, and run, the API application itself.
2. Fully Automated. There is no browser, interface, or human involved.
3. Extends outside of SAS. There are firewalls, and authentication methods, to consider.
The Data Controller technical solution will differ, depending on the type of SAS Platform being used. There are three types of SAS Platform:
1. Foundation SAS - regular, Base SAS.
2. SAS EBI - with Metadata.
3. SAS Viya - cloud enabled.
And there are three main options when it comes to building APIs on SAS:
1. Standalone DC API (Viya Only). Viya comes with [REST APIs](https://developer.sas.com/apis/rest/) out of the box, no middleware needed.
2. [SAS 9 API](https://github.com/analytium/sas9api). This is an open-source Java Application, using SAS Authentication.
3. [SASjs Server](https://github.com/sasjs/server). An open source NodeJS application, compatible with all major authentication methods and all versions of SAS
An additional REST API option for SAS EBI might have been [BI Web Services](https://documentation.sas.com/doc/en/bicdc/9.4/bimtag/p1acycjd86du2hn11czxuog9x0ra.htm), however - it requires platform changes and is not highly secure.
The compatibility matrix is therefore as follows:
| Product | Foundation SAS| SAS EBI | SAS VIYA |
|---|---|---|---|
| DCAPI | ❌ | ❌ | ✅ |
| DCAPI + SASjs Server | ✅ | ✅ | ✅ |
| DCAPI + SAS 9 API | ❌ | ✅ | ❌ |
In all cases, a Data Controller API will be surfaced, that makes use of the underlying (raw) API server.
The following sections break down these options, and the work remaining to make them a reality.
## Standalone DC API (Viya Only)
For Viya, the investment necessary is relatively low, thanks to the API-first nature of the platform. In addition, the SASjs framework already provides most of the necessary functionality - such as authentication, service execution, handling results & logs, etc. Finally, the Data Controller team have already built an API Bridge (specific to another customer, hence the building blocks are in place).
The work to complete the Viya version of the API is as follows:
* Authorisation interface
* Creation of API services
* Tests & Automated Deployments
* Developer docs
* Swagger API
* Public Documentation
Cost to complete - £5,000 (Viya Only)
## SASjs Server (Foundation SAS)
[SASjs Server](https://github.com/sasjs/server) already provides an API interface over Foundation SAS. An example of building a web app using SASjs Server can be found [here](https://www.youtube.com/watch?v=F23j0R2RxSA). In order for it to fulfill the role as the engine behind the Data Controller API, additional work is needed - specifically:
* Secure (Enterprise) Authentication
* Users & Groups
* Feature configuration (ability to restrict features to different groups)
On top of this, the DC API part would cover:
* Authorisation interface
* Creation of API services
* Tests & Automated Deployments
* Developer docs
* Swagger API
* Public Documentation
Cost to complete - £10,000 (fixed)
Given that all three SAS platforms have Foundation SAS available, this option will work everywhere. The only restriction is that the sasjs/server instance **must** be located on the same server as SAS.
## SAS 9 API (SAS EBI)
This product has one major benefit - there is nothing to install on the SAS Platform itself. It connects to SAS in much the same way as Enterprise Guide (using the SAS IOM).
Website: [https://sas9api.io](https://sas9api.io)
Github: [https://github.com/analytium/sas9api](https://github.com/analytium/sas9api)
The downside is that the features needed by Data Controller are not present in the API. Furthermore, the tool is not under active development. To build out the necessary functionality, we would need to source a senior Java developer on a short-term contract - first to understand the tool, and then to update it in a sustainable way.
We estimate the cost to build Data Controller API on this mechanism at £20,000 - but it could be higher.


@ -0,0 +1,99 @@
---
layout: article
title: Column Level Security
description: Column Level Security prevents end users from viewing or editing specific columns in SAS according to their group membership.
og_image: https://docs.datacontroller.io/img/cls_table.png
---
# Column Level Security
Column level security is implemented by mapping _allowed_ columns to a list of SAS groups. In VIEW mode, only allowed columns are visible. In EDIT mode, allowed columns are _editable_ - the remaining columns are read-only.
Below is an example of an EDIT table with only one column enabled for editing:
![lockanytable example](/img/cls_example.png)
See also: [Row Level Security](/row-level-security/).
## Configuration
The variables in MPE_COLUMN_LEVEL_SECURITY should be configured as follows:
### CLS_SCOPE
Determines whether the rule applies to the VIEW page, the EDIT page, or ALL pages. The impact of the rule varies as follows:
#### VIEW Scope
When `CLS_SCOPE in ('VIEW','ALL')` then only the listed columns are _visible_ (unless `CLS_HIDE=1`)
#### EDIT Scope
When `CLS_SCOPE in ('EDIT','ALL')` then only the listed columns are _editable_ (the remaining columns are read-only, and visible). Furthermore:
* The user will be unable to ADD or DELETE records.
* Primary Key values are always read only
* Primary Key values cannot be hidden (`CLS_HIDE=1` will have no effect)
### CLS_GROUP
The SAS Group to which the rule applies. The user could also be a member of a [DC group](/dcc-groups).
- If a user is in ANY of the groups, the columns will be restricted.
- If a user is in NONE of the groups, no restrictions apply (all columns available).
- If a user is in MULTIPLE groups, they will see all allowed columns across all groups.
- If a user is in the [Data Controller Admin Group](/dcc-groups/#data-controller-admin-group), CLS rules DO NOT APPLY.
### CLS_LIBREF
The library of the target table against which the security rule will be applied
### CLS_TABLE
The target table against which the security rule will be applied
### CLS_VARIABLE_NM
This is the name of the variable against which the security rule will be applied.
### CLS_ACTIVE
If you would like this rule to be applied, be sure this value is set to 1.
### CLS_HIDE
This variable can be set to `1` to _hide_ specific variables, which allows greater control over the EDIT screen in particular. CLS_SCOPE behaviour is impacted as follows:
* `ALL` - the variable will not be visible in either VIEW or EDIT.
* `EDIT` - the variable will not be visible. **Cannot be applied to a primary key column**.
* `VIEW` - the variable will not be visible. Can be applied to a primary key column. Simply omitting the row, or setting CLS_ACTIVE to 0, would result in the same behaviour.
It is possible that a variable can have multiple values for CLS_HIDE, eg if a user is in multiple groups, or if different rules apply for different scopes. In this case, if the user is in any group where this variable is NOT hidden, then it will be displayed.
## Example Config
Example values as follows:
|CLS_SCOPE:$4|CLS_GROUP:$64|CLS_LIBREF:$8| CLS_TABLE:$32|CLS_VARIABLE_NM:$32|CLS_ACTIVE:8.|CLS_HIDE:8.|
|---|---|---|---|---|---|---|
|EDIT|Group 1|MYLIB|MYDS|VAR_1|1||
|ALL|Group 1|MYLIB|MYDS|VAR_2|1||
|ALL|Group 2|MYLIB|MYDS|VAR_3|1||
|VIEW|Group 1|MYLIB|MYDS|VAR_4|1||
|EDIT|Group 1|MYLIB|MYDS|VAR_5|1|1|
If a user is in Group 1, and viewing `MYLIB.MYDS` in EDIT mode, **all** columns will be visible but only the following columns will be editable:
* VAR_1
* VAR_2
The user will be unable to add or delete rows.
If the user is in both Group 1 AND Group 2, viewing `MYLIB.MYDS` in VIEW mode, **only** the following columns will be visible:
* VAR_2
* VAR_3
* VAR_4
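For reference, the example rules above could be staged with a simple datalines step. This is a sketch only, using the column lengths from the table header - your MPE_COLUMN_LEVEL_SECURITY table may carry additional audit columns:

```sas
/* Sketch: build the example CLS rules shown above */
data work.cls_example;
  length cls_scope $4 cls_group $64 cls_libref $8
         cls_table $32 cls_variable_nm $32 cls_active cls_hide 8;
  infile datalines dsd dlm='|';
  input cls_scope $ cls_group $ cls_libref $ cls_table $
        cls_variable_nm $ cls_active cls_hide;
datalines;
EDIT|Group 1|MYLIB|MYDS|VAR_1|1|.
ALL|Group 1|MYLIB|MYDS|VAR_2|1|.
ALL|Group 2|MYLIB|MYDS|VAR_3|1|.
VIEW|Group 1|MYLIB|MYDS|VAR_4|1|.
EDIT|Group 1|MYLIB|MYDS|VAR_5|1|1
;
run;
```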
## Video Example
This short video does a walkthrough of applying Column Level Security from end to end.
<iframe width="560" height="315" src="https://www.youtube.com/embed/jAVt-omtjVc" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>


@ -21,6 +21,24 @@ There are 5 roles identified for users of the Data Controller:
4. *Auditor*. An auditor has the ability to review the [history](dc-userguide.md#history) of changes to a particular table.
5. *Administrator*. An administrator has the ability to add new [tables](dcc-tables.md) to the Data Controller, and to configure the security settings (at metadata group level) as required.
## What is a submission?
The submission is the data that has been staged for approval. Note - submissions are never applied automatically! They must always be approved by 1 or more approvers first. The process of submission varies according to the type of submit.
### Web Submission
When using the Web editor, a frontend check is made against the subset of data that was filtered for editing to see which rows are new / modified / marked deleted. Only those changed rows (from the extract) are submitted to the staging area.
### Excel Submission
When importing an Excel file, all rows are loaded into the web page. You have an option to edit those records. If you edit them, the original Excel file is discarded, and only changed rows are submitted (it becomes a web submission). If you hit SUBMIT immediately, then ALL rows are staged, and a copy of the Excel file is uploaded for audit purposes.
### CSV submission
A CSV upload bypasses the part where the records are loaded into the web page, and ALL rows are sent to the staging area directly. This makes it suitable for larger uploads.
## Edit Stage Approve Workflow
Up to 500 rows can be edited (in the web editor) at one time. These edits are submitted to a staging area. After one or more approvals (acceptances) the changes are applied to the source table.
![screenshot](img/dcu_flow.png)
## Use Case Diagram

There are five roles (Viewer, Editor, Approver, Auditor, Administrator) which correspond to 5 primary use cases (View Table, Edit Table, Approve Change, View Change History, Configure Table)


@ -4,7 +4,7 @@
The Data Controller has 5 tabs, as follows:

* *[Viewer](#viewer)*. This tab lets users view any table to which they have been granted access in metadata. They can also download the data as csv, excel, or as a SAS program (datalines). Primary key fields are coloured green.
* *[Editor](#editor)*. This tab enables users to add, modify or delete data. This can be done directly in the browser, or by uploading a CSV file. Values can also be copy-pasted from a spreadsheet. Once changes are ready, they can be submitted, with a corresponding reason.
* *[Submitted](#submitted)*. This shows an editor the outstanding changes that have been submitted for approval (but have not yet been approved or rejected).
* *[Approvals](#approvals)*. This shows an approver all their outstanding approval requests.
@ -40,7 +40,7 @@ The Editor screen lets users who have been pre-authorised (via the `DATACTRL.MPE
1 - *Filter*. The user can filter before proceeding to perform edits.

2 - *Upload*. If you have a lot of data, you can [upload it directly](files). The changes are then approved in the usual way.

3 - *Edit*. This is the main interface, data is displayed in tabular format. The first column is always "Delete?", as this allows you to mark rows for deletion. Note that removing a row from display does not mark it for deletion! It simply means that this row is not part of the changeset being submitted.

The next set of columns are the Primary Key, and are shaded grey. If the table has a surrogate / retained key, then it is the Business Key that is shown here (the RK field is calculated / updated at the backend). For SCD2 type tables, the 'validity' fields are not shown. It is assumed that the user is always working with the current version of the data, and the view is filtered as such.
@ -51,6 +51,10 @@ New rows can be added using the right click context menu, or the 'Add Row' butto
When ready to submit, hit the SUBMIT button and enter a reason for the change. The owners of the data are now alerted (so long as their email addresses are in metadata) with a link to the approve screen.

If you are also an approver you can approve this change yourself.
#### Special Missings
Data Controller supports special missing numerics, ie - a single letter or underscore. These should be submitted _without_ the leading period. The letters are not case sensitive.
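In SAS itself, special missing values are declared with the `missing` statement and written with a leading period (`.A`, `._`), whereas in a Data Controller submission the same cells contain just the letter or underscore. A minimal illustration:

```sas
/* Special missing numerics (.A-.Z and ._) in raw data */
data work.example;
  missing a _;   /* allow A and _ as special missings in input */
  input score;
  datalines;
42
A
_
;
run;
/* In a Data Controller submission the same cells would contain
   simply A and _ (no leading period, letters not case sensitive) */
```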
#### BiTemporal Tables

The Data Controller only permits BiTemporal data uploads at a single point in time - so for convenience, when viewing data in the edit screen, only the most recent records are displayed. To edit earlier records, either use file upload, or apply a filter.


@ -2,21 +2,45 @@
## Overview

Dates & datetimes are stored as plain numerics in regular SAS tables. In order for the Data Controller to recognise these values as dates / datetimes a format must be applied.

![displayed](img/dcc-dates1.png) ![source](img/dcc-dates2.png)

Supported date formats:

* DATE.
* DDMMYY.
* MMDDYY.
* YYMMDD.
* E8601DA.
* B8601DA.
* NLDATE.

Supported datetime formats:

* DATETIME.
* NLDATM.

Supported time formats:

* TIME.
* HHMM.

In SAS 9, this format must also be present / updated in the metadata view of the (physical) table to be displayed properly. This can be done using DI Studio, or by running the following (template) code:

```sas
proc metalib;
  omr (library="Your Library");
  folder="/Shared Data/table storage location";
  update_rule=(delete);
run;
```
!!! note
Data Controller does not support decimals when EDITING. For datetimes, this means that values must be rounded to 1 second (milliseconds are not supported).
If you have other dates / datetimes / times you would like us to support, do [get in touch](https://datacontroller.io/contact)!


@ -1,4 +1,24 @@
---
layout: article
title: Groups
description: By default, Data Controller will work with the SAS Groups defined in Viya, Metadata, or SASjs Server. It is also possible to define custom groups with Data Controller itself.
og_image: https://i.imgur.com/drGQBBV.png
---
# Adding Groups
## Overview

By default, Data Controller will work with the SAS Groups defined in Viya, Metadata, or SASjs Server. It is also possible to define custom groups with Data Controller itself - to do this simply add the user and group name (and optionally, a group description) in the `DATACTRL.MPE_GROUPS` table.
![](https://i.imgur.com/drGQBBV.png)
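A custom group entry might be added as follows. This is a hypothetical sketch - the column names (`group_name`, `user_name`, `group_desc`) are assumptions, so verify them against your own `DATACTRL.MPE_GROUPS` table, and prefer submitting the change through the Data Controller editor so it is audited:

```sas
/* Hypothetical sketch - verify column names against your
   DATACTRL.MPE_GROUPS table before use */
proc sql;
insert into datactrl.mpe_groups
  set group_name='Finance Editors'          /* custom DC group  */
    , user_name='allbow'                    /* platform user id */
    , group_desc='Can edit finance tables';
quit;
```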
## Data Controller Admin Group
When configuring Data Controller for the first time, a group is designated as the 'admin' group. This group has unrestricted access to Data Controller. To change this group, modify the `%let dc_admin_group=` entry in the settings program, located as follows:
* **SAS Viya:** $(appLoc)/services/settings.sas
* **SAS 9:** $(appLoc)/services/public/Data_Controller_Settings
* **SASjs Server:** $(appLoc)/services/public/settings.sas
To prevent others from changing this group, ensure the Data Controller appLoc (deployment folder) is write-protected - eg RM (metadata) or using Viya Authorisation rules.

docs/dcc-options.md Normal file

@ -0,0 +1,69 @@
---
layout: article
title: DC Options
description: Options in Data Controller are set in the MPE_CONFIG table and apply to all users.
og_title: Data Controller for SAS® Options
og_image: /img/mpe_config.png
---
# Data Controller for SAS® - Options
The [MPE_CONFIG](/tables/mpe_config/) table provides a number of system options, which apply to all users. The table may be re-purposed for other applications, so long as scopes beginning with "DC_" are avoided.
Currently used scopes include:
* DC
* DC_CATALOG
## DC Scope
### DC_EMAIL_ALERTS
Set to YES or NO to enable email alerts. This requires email options to be preconfigured (mail server etc).
### DC_MAXOBS_WEBEDIT
By default, a maximum of 100 observations can be edited in the browser at one time. This number can be increased, but note that the following factors will impact performance:
* Number of configured [Validations](/dcc-validations)
* Browser type and version (works best in Chrome)
* Number (and size) of columns
* Speed of client machine (laptop/desktop)
### DC_REQUEST_LOGS
On SASjs Server and SAS9 Server types, at the end of each DC SAS request, a record is added to the [MPE_REQUESTS](/tables/mpe_requests) table. In some situations this can cause table locks. To prevent this issue from occurring, the `DC_REQUEST_LOGS` option can be set to `NO` (Default is `YES`).
### DC_RESTRICT_EDITRECORD
Setting YES will prevent the EDIT RECORD dialog appearing in the EDIT screen by removing the "Edit Row" option in the right click menu, and the "ADD RECORD" button in the bottom left.
Anything other than YES will mean that the modal _is_ available.
Default=NO
### DC_RESTRICT_VIEWER
Set to YES to restrict the list of libraries and tables in VIEWER to only those explicitly set to VIEW in the MPE_SECURITY table. The default is NO (users can see all tables they already have permission to see).
### DC_VIEWLIB_CHECK
Set to YES to enable library validity checking in viewLibs service. This means that on first load, SAS will attempt to open each library to see if it is possible to do so. This reduces the number of libraries in the list, but means that it is slow to load the first time around.
The default is NO.
### DC_LOCALE
Set to a locale (such as `en_gb` or `en_be`) to override the system value (which may be derived from client browser settings).
This feature is useful when importing ambiguous dates from CSV or Excel (eg 1/2/20 vs 2/1/20) as DC uses the `anydtdtm.` informats for import.
Default=SYSTEM.
!!! note
If you have clients in different geographies loading excel in local formats, you can also address this issue by ensuring the locale of the windows _user_ profile is not set to the default (eg `English (United States)`). When leaving the DC_LOCALE as SYSTEM, the locale settings in SAS are not added or modified.
## DC_CATALOG Scope
### DC_IGNORELIBS
When running the [Refresh Data Catalog](/admin-services/#refresh-data-catalog) service, it is often the case that the process will fail due to being unable to assign a library. To avoid the need to resolve the connection issue elsewhere in SAS, you can simply exclude the library from the Data Catalog, by including its LIBREF in this field (pipe-separated)
## DC_REVIEW Scope
### HISTORY_ROWS
Number of rows to return for each HISTORY page. Default - 100. Increasing this will increase the page size for all users. Using very large numbers here can result in a sluggish page load time. If you need large amounts of HISTORY data, it is generally better to extract it directly from the [MPE_REVIEW](/tables/mpe_review/) table.


@ -3,11 +3,11 @@
## Summary ## Summary
DC security is applied at the level of Table and Group. Permissions can only be set at group level. There are two parts to adding a user:

1 - Adding the user to the relevant [group](/dcc-groups)

2 - Ensuring that group has the appropriate access level in the MPE_SECURITY table

For guidance with adding SAS users in SAS 9, see [SAS Documentation](http://support.sas.com/documentation/cdl/en/mcsecug/69854/HTML/default/viewer.htm#n05epzfefjyh3dn1xdw2lkaxwyrz.htm).
## Details
In order to surface a table to a new group, simply add a record to the `DATACTRL.MPE_SECURITY` table:
![Screenshot](img/securitytable.png)
## ACCESS_LEVEL
### EDIT
The `EDIT` permission determines which groups will be able to upload CSVs and submit changes via the web interface for that table.
### APPROVE
The `APPROVE` permission determines which groups will be able to approve those changes, and hence enable the target table to be loaded. If you wish to have members of a particular group both edit AND approve, then two lines (one for each group) must be entered, per table.
### VIEW
The default behaviour when installing Data Controller is that the [viewer](dcu-tableviewer.md) lets all SAS Users see all the tables that they are authorised to view in SAS. However there may be reasons to further restrict the tables in this component.
There is a global setting that will disable ALL tables in VIEWER unless explicitly authorised - this is available in MPE_CONFIG. Set `DC_RESTRICT_VIEWER=YES`, submit, and approve.
If groups are authorised without this setting, tables will be restricted only in that library (the rest will still be visible).
Groups can be given VIEW access for all libraries or all tables within a library by using the keyword `*ALL*` instead of the libref / tablename.
It's also worth being aware of the `DC_VIEWLIB_CHECK` option in MPE_CONFIG. When this is switched on, SAS will confirm that the library is valid and contains tables before adding it to the list. This can sometimes be slow (depending on your library configurations), hence it is disabled by default - but as the list is cached on the frontend (until the next hard refresh), the impact may be worth it.
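To illustrate (library and group names below are hypothetical), MPE_SECURITY entries granting VIEW access might look as follows:

|LIBREF|DSN|SAS_GROUP|ACCESS_LEVEL|
|---|---|---|---|
|SALES|*ALL*|Marketing|VIEW|
|*ALL*|*ALL*|Data Admins|VIEW|

The first row exposes every table in the (hypothetical) `SALES` library to the Marketing group; the second exposes all libraries to a Data Admins group.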
## Determining Group Members
Before adding a group to Data Controller, it helps to know the members of that group! A User navigator is available in both the SAS 9 and Viya versions of Data Controller. You can navigate Users, Groups and Roles (roles are only visible in the SAS 9 version).
This means you do not need SAS Management Console or SAS Environment Manager to manage Data Controller users. However you will need those tools for managing SAS Groups, unless you define your own groups in the [MPE_GROUPS](dcc-groups.md) table.

---
layout: article
title: MPE_TABLES
description: Adding tables to the Data Controller is a matter of configuration, specifically the addition of a new record to `DATACTRL.MPE_TABLES`, and corresponding entries in `DATACTRL.MPE_SECURITY`.
og_image: https://i.imgur.com/DtVU62u.png
---
# Data Controller for SAS® - Adding Tables
## Overview
Adding tables to the Data Controller is a matter of configuration, specifically the addition of a new record to the `DATACTRL.MPE_TABLES` table, and corresponding entries in the `DATACTRL.MPE_SECURITY` table.
!!! note
    In order to surface the table to (non admin) users, appropriate groups should be configured as per [security](dcc-security.md) settings.
![screenshot](img/configtable.png)
## MPE_TABLES Configuration Details
Each table to be edited in the Data Controller is represented by one record in `DATACTRL.MPE_TABLES`. The fields should be populated as follows:
### LIBREF
The libref of the table. If not pre-assigned, and the serverType is SAS 9 (EBI), DC will assign it at runtime using the first definition found in metadata, using this [macro](https://core.sasjs.io/mm__assigndirectlib_8sas.html).
### DSN
The dataset (table) name as visible when assigning a direct libref connection to `LIBREF`. If the target is a format catalog, it should have a "-FC" suffix (eg `FORMATS-FC`). More info on formats [here](formats.md).
### NUM_OF_APPROVALS_REQUIRED
This is an integer representing the number of approvals required before a table is updated. This mechanism lets you insist on, for example, 2 or 3 approvals before sensitive data is updated following a submission. Note that only one rejection is ever necessary to remove the submission.
This is a required field.
### LOADTYPE
The loadtype determines the nature of the update to be applied. Valid values are as follows:
- UPDATE. This is the most basic type - simply provide the primary key fields in the `BUSKEY` column.
- FORMAT_CAT. For updating Format Catalogs, the BUSKEY should be `FMTNAME START`. See [formats](/formats).
- TXTEMPORAL. This signifies an SCD2 type load. For this type the validity fields (valid from, valid to) should be specified in the `VAR_TXFROM` and `VAR_TXTO` fields. The table itself should include `VAR_TXFROM` in the physical key. The remainder of the primary key fields (not including `VAR_TXFROM`) should be specified in `BUSKEY`.
- BITEMPORAL. These tables have two time dimensions - a version history, and a business history. The version history (SCD2) fields should be specified in `VAR_TXFROM` and `VAR_TXTO` and the business history fields should be specified in `VAR_BUSFROM` and `VAR_BUSTO`. Both the `VAR_TXFROM` and `VAR_BUSFROM` fields should be in the physical key of the actual table, but should NOT be specified in the `BUSKEY` field.
- REPLACE. This loadtype simply deletes all the rows and appends the staged data. Changes are NOT added to the audit table. In the diff screen, previous rows are displayed as deleted, and staged rows as new (modified values are not displayed). Can be useful for updating single-row tables.
This is a required field.
!!! Note
    The support for BITEMPORAL loads is restricted, in the sense that it is only possible to load data at a single point in time (no support for loading _multiple_ business date ranges for a _specific_ BUSKEY). The workaround is simply to load each date range separately. As a result of this restriction, the EDIT page will only show the latest business date range for each key. To modify earlier values, a filter should be applied.
!!! Warning
    If your target table contains referential constraints (eg primary key values that are linked to a child table with a corresponding foreign key) then this will cause problems with the UPDATE and REPLACE load types. This is due to the fact these both involve delete operations. If removal of these constraints is not an option, the workaround would be to create a separate (mirror) table, and update that using PRE-EDIT and POST-APPROVE hook scripts. Please contact Data Controller support for advice / assistance.
### BUSKEY
The business (natural) key of the table. For SCD2 / Bitemporal, this does NOT include the validity dates. For Retained / Surrogate key tables, this contains the actual surrogate key - the underlying fields that are used to create the surrogate key are specified in [RK_UNDERLYING](#rk_underlying).
This is a required field.
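Putting [LOADTYPE](#loadtype) and BUSKEY together, a hypothetical SCD2 (TXTEMPORAL) entry in MPE_TABLES might look like this (all names are illustrative):

|LIBREF|DSN|LOADTYPE|BUSKEY|VAR_TXFROM|VAR_TXTO|
|---|---|---|---|---|---|
|MYLIB|CUSTOMER|TXTEMPORAL|CUSTOMER_ID|TX_FROM|TX_TO|

Here the physical key of `MYLIB.CUSTOMER` would be `CUSTOMER_ID` plus `TX_FROM`, whilst only `CUSTOMER_ID` is listed in the BUSKEY field.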
### VAR_TXFROM / VAR_TXTO
The SCD2 type validity dates, representing the point in time at which the field was created (`VAR_TXFROM`) and when it was closed out (`VAR_TXTO`) from a change or deletion. If the record is active, the `VAR_TXTO` field would contain a high value. `VAR_TXFROM` is a part of the physical key of the underlying table.
These fields should contain the NAME of the variables which contain the open / close timestamps in the underlying table.
Leave blank if not required.
### VAR_BUSFROM / VAR_BUSTO
The BITEMPORAL _business_ dates which represent the reporting period to which the record is valid. Typically these contain _date_ values (rather than _datetime_ values). If variables are specified here, then the [LOADTYPE](#loadtype) should be `BITEMPORAL`.
Leave blank if not required.
### VAR_PROCESSED
Set the name of a variable (eg `processed_dttm`) which should be given a current timestamp whenever the table is updated.
Leave blank if not required.
### CLOSE_VARS
By default, the Data Controller will only process the records that are part of a changeset. This means that records must be explicitly marked for deletion. But what if you are performing a reload of a monthly batch, and the _absence_ of a record implies that it is no longer required? For this scenario, it is necessary to specify the range within which a 'complete' load is expected - for instance, by reporting month, or month + product. When performing loads, the DC will then first extract a distinct list of values for this key and close them out in the target table, before performing the upload. The `CLOSE_VARS` are typically a subset of the [BUSKEY](#buskey) fields.
Leave blank if not required.
### PRE_EDIT_HOOK
[Hook script](#hook-scripts) to execute _prior_ to an edit being made. This allows data to be modified before being presented for editing, or for display formats to be applied.
Leave blank if not required.
SAS Developer Notes:
* Target dataset: `work.OUT`
* Filters will have been applied, and table sorted on [BUSKEY](#buskey)
* Base libref.table or catalog variable: `&orig_libds`
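As a minimal sketch (the `AMOUNT` column is hypothetical, not part of any shipped table), a pre-edit hook could apply a display format to the presented data:

```sas
/* pre-edit hook sketch: apply a display format to work.OUT */
proc datasets lib=work nolist;
  modify out;
  format amount comma14.2; /* AMOUNT is an assumed example column */
quit;
```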
### POST_EDIT_HOOK
[Hook script](#hook-scripts) to execute _after_ an edit has been made. Useful when there is a need to augment data (derived / calculated columns), or perform advanced data quality checks prior to approval.
Leave blank if not required.
SAS Developer Notes:
* Staged dataset: `work.STAGING_DS`
* Target libref.table or catalog variable: `&orig_libds`
If your DQ check means that the program should not be submitted, then simply exit with `&syscc > 4`. You can even set a message to go back to the user by using the [mp_abort](https://core.sasjs.io/mp__abort_8sas.html) macro:
```sas
%mp_abort(iftrue= (&syscc ne 0) /* if this condition is true, the process will exit */
  ,msg=%str(YOUR MESSAGE GOES HERE)
)
```
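For example, a post-edit hook performing a simple data quality check on the staged data might look like this (a sketch only - the `AMOUNT` column is an assumed example):

```sas
/* post-edit hook sketch: block submissions containing negative amounts */
data _null_;
  set work.staging_ds;
  if amount < 0 then call symputx('syscc',8);
run;
%mp_abort(iftrue= (&syscc > 4)
  ,msg=%str(Negative amounts are not permitted)
)
```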
### PRE_APPROVE_HOOK
This [hook script](#hook-scripts) will execute twice during a typical workflow - firstly, before the approval diff is generated, and again after the approval (not rejection) and _before_ the change is applied.
This makes it a helpful place to prevent changes being made, eg in situations where the target table needs to be locked by alternative systems.
It can also be used to apply display formats, or to prepare any derived 'system' columns such as "LAST_APPROVER_NM".
Leave blank if not required.
SAS Developer Notes:
* Staged dataset: `work.STAGING_DS`
* Target libref.table or catalog variable: `&orig_libds`
### POST_APPROVE_HOOK
This [hook script](#hook-scripts) is `%inc`'d _after_ an approval is made. This is the most common type of hook script, and is useful for, say, running a SAS job after a mapping table is updated, or running a model after changing a parameter.
Leave blank if not required.
SAS Developer Notes:
At the point of running this script, the data has already been loaded (successfully) to the target table. Therefore the target is typically the base libref.table (or format catalog) itself and can be referenced directly (YOURLIB.YOURDATASET), or using either of the following macro variables:
* `&orig_libds`
* `&libref..&ds`
The staged table is also available, as `work.STAGING_DS`.
If you are making changes to the target table as part of the hook, then in order to prevent contention from other users making concurrent edits, you are advised to "LOCK" and "UNLOCK" it using the [mp_lockanytable](https://core.sasjs.io/mp__lockanytable_8sas.html) macro:
```sas
/* lock SOMELIB.SOMETABLE */
%mp_lockanytable(LOCK,
  lib=SOMELIB,
  ds=SOMETABLE,
  ref=Locking table to perform a post approve hook action,
  ctl_ds=&mpelib..mpe_lockanytable
)
/* do stuff */
proc sort data=somelib.sometable;
  by _all_;
run;
/* unlock */
%mp_lockanytable(UNLOCK,
  lib=SOMELIB,
  ds=SOMETABLE,
  ctl_ds=&mpelib..mpe_lockanytable
)
```
The SAS session will already contain the mp_lockanytable macro definition.
### SIGNOFF_COLS
Used to determine a range (eg reporting month) to which a 'final version' can be marked. This allows a particular version of data to be marked as final, meaning that the data can continue to change afterwards (reports can simply query for the timestamp of the 'final' version of the data).
Leave blank if not required.
### SIGNOFF_HOOK
This [hook script](#hook-scripts) is `%inc`'d after a 'final version' has been signed off.
Leave blank if not required.
### NOTES
Content entered here will be displayed to the approver on signoff.
Not required, but recommended.
### RK_UNDERLYING
For retained / surrogate keys, an auto-incrementing field is used to represent each unique record. In this case, the RK (integer) field itself should be added in the [BUSKEY](#buskey) column, and the natural / underlying key should be added here.
Leave blank unless using retained / surrogate keys.
### AUDIT_LIBDS
If this field is blank (ie empty, missing), **every** change is captured in the [MPE_AUDIT](/tables/mpe_audit). This can result in large data volumes for frequently changing tables.
Alternative options are:
1. Enter a zero (`0`) to switch off audit logging completely
2. Enter a library.dataset reference of an alternative audit table in which to capture the change history.
For option 2, the base table structure can be generated using this macro: [https://core.sasjs.io/mddl__dc__difftable_8sas_source.html](https://core.sasjs.io/mddl__dc__difftable_8sas_source.html).
## HOOK Scripts
Data Controller allows SAS programs to be executed at certain points in the ingestion lifecycle, such as:
* Before an edit (to control the edit screen)
* After an edit (perform complex data quality)
* Before an approval (control the approve screen)
* After an approval (trigger downstream jobs with new data)
The code is simply `%include`'d at the relevant point during backend execution. The program may be:
* Physical, ie the full path to a `.sas` program on the physical server directory
* Logical, ie a Viya Job (SAS Drive), SAS 9 Stored Process (Metadata Folder) or SASJS Stored Program (SASjs Drive).
If the entry ends in `".sas"` it is assumed to be a physical, filesystem file. Otherwise, the source code is extracted from SAS Drive or Metadata.
To illustrate:
* Physical filesystem (ends in .sas): `/opt/sas/code/myprogram.sas`
* Logical filesystem: `/Shared Data/stored_processes/mydatavalidator`
!!! warning
    Do not place your hook scripts inside the Data Controller (logical) application folder, as they may be inadvertently lost during a deployment (eg in the case of a backup-and-deploy-new-instance approach).

docs/dcc-validations.md
---
layout: article
title: Data Validation
description: Quality in, Quality out! Enforce data quality checks at the point of SAS data entry, both directly via the web interface and also via Excel uploads.
og_image: https://i.imgur.com/P64ijBB.png
---
# Data Controller for SAS® - DQ Validations
## Overview
Quality in, Quality out! Data Controller lets you enforce quality checks at the point of data entry, both directly via the web interface and also via Excel uploads.
## Default Checks
By default, the following frontend rules are always applied:
* Length checking per target table variable lengths
* Type checking per target table datatypes (Character, Numeric, Date, Time, Datetime)
* Not Null check per target table constraints
* Primary Key checking per business key defined in MPE_TABLES
It is possible to configure a number of other rules by updating the MPE_VALIDATIONS table. Simply set the `BASE_LIB`, `BASE_DS` and `BASE_COL` values, and ensure `RULE_ACTIVE=1` for it to be applied.
## Configurable Checks
Check back frequently as we plan to keep growing this list of checks.
|Rule Type|Example Value|Description|
|---|---|---|
|CASE|UPCASE|Will enforce the case of cell values. Valid values: UPCASE, LOWCASE, PROPCASE|
|NOTNULL|(defaultval)|Will prevent submission if null values are present. Optional - provide a default value.|
|MINVAL|1|Defines a minimum value for a numeric cell|
|MAXVAL|1000000|Defines a maximum value for a numeric cell|
|HARDSELECT|sashelp.class.name|A distinct list of values (max 1000) are taken from this library.member.column reference, and the value **must** be in this list. This list may be supplemented by entries in the MPE_SELECTBOX table.|
|SOFTSELECT|dcdemo.mpe_tables.libref|A distinct list of values (max 1000) are taken from this library.member.column reference, and the user-provided value may (or may not) be in this list. This list may be supplemented by entries in the MPE_SELECTBOX table.|
|[HARDSELECT_HOOK](/dynamic-cell-dropdown)|/logical/folder/stpname|A SAS service (STP or Viya Job) or a path to a SAS program on the filesystem. User provided values **must** be in this list. Cannot be used alongside a SOFTSELECT_HOOK.|
|[SOFTSELECT_HOOK](/dynamic-cell-dropdown)|/physical/path/program.sas|A SAS service (STP or Viya Job) or a path to a SAS program on the filesystem. User-provided values may (or may not) be in this list. Cannot be used alongside a HARDSELECT_HOOK.|
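As a hypothetical example, to enforce uppercase values in a `REGION` column of a table `MYLIB.SALES`, the MPE_VALIDATIONS record might be populated as follows (the RULE_TYPE / RULE_VALUE column names are assumed from the table above):

|BASE_LIB|BASE_DS|BASE_COL|RULE_TYPE|RULE_VALUE|RULE_ACTIVE|
|---|---|---|---|---|---|
|MYLIB|SALES|REGION|CASE|UPCASE|1|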
## Dropdowns
There are now actually FIVE places where you can configure dropdowns!
1. The [MPE_SELECTBOX](/dcc-selectbox/) table
2. The HARDSELECT validation (library.member.column reference)
3. The SOFTSELECT validation (library.member.column reference)
4. The HARDSELECT_HOOK validation (SAS Program)
5. The SOFTSELECT_HOOK validation (SAS Program)
How do these inter-operate?
Well - if you have values in MPE_SELECTBOX and/or HARDSELECT / SOFTSELECT tables, they will be merged together, and served in ADDITION to the values provided by any HOOK program.
Dropdowns are SOFT by default, unless a HARD rule is present.
Data Controller will not let you submit both a HARDSELECT_HOOK and a SOFTSELECT_HOOK on the same variable.

# Data Controller for SAS® - Backend Deployment
## Overview
The backend for Data Controller consists of a set of Stored Processes, a macro library, and a database. The database can be a SAS Base library if desired, however this can cause contention (eg table locks) if end users are able to connect to the datasets directly, eg via Enterprise Guide or Base SAS.
## Regular Deployment
1 - Import `/sas/import.spk` using SAS Management Console. Make a note of the root location in which this was deployed - as this will be added to the `metadataRoot` value in the `h54sConfig.json` file in the [frontend](dci-frontend.md#details) deployment.
2 - Create a physical staging directory. This folder will contain the logs and CSV files generated by Users. The SAS Spawned Server account (eg `sassrv`) will need write access to this location.
3 - Register a library in metadata for the control database. The libref should be `DATACTRL`. If this is not possible, then an alternative libref can be used, simply specify it in the configuration component.
4 - Update the configuration component (imported in the SPK) with the following attributes:
* `dc_staging_area` - location of staging directory as per step 2
* `dc_admin_group` - enter the name of a metadata group (eg SASAdministrators) that should be given unrestricted access to the tool
* `dc_libref` - if you were unable to use the `DATACTRL` libref in step 3, then use the updated libref here
5 - Deploy the physical tables and register them in metadata. For this, simply compile and run the `mpe_build()` macro using an account with appropriate privileges.
!!! note
    Make sure the SAS Spawned Server account (eg `sassrv`) can access these tables!
The next step is to deploy the [frontend](dci-frontend.md).
## EUC Deployment
Optionally, a shared network drive can be configured to enable EUCs to temporarily stage CSVs for upload into the Data Controller review process.
For security, it is recommended to set permissions so that end users can write, but not read or modify. The SAS Spawned Server account (eg `sassrv`) will need read and modify access - as it will remove the files once they are loaded into the secure staging area.
## Deployment Diagram
An overview of how the components fit together is available below:
![deploymentdiagram](img/dci_deploymentdiagram.png)

docs/dci-deploysas9.md
---
layout: article
title: DC SAS 9 Deployment
description: How to deploy Data Controller in a production SAS 9 environment
og_image: https://docs.datacontroller.io/img/dci_deploymentdiagram.png
---
# SAS 9 Deployment
## Deployment Process
There are two ways to deploy Data Controller on SAS 9:
* Full Deployment (preferred)
* Streaming (for quick demos)
### Full Deployment
#### 1 - Deploy Stored Processes
The Stored Processes are deployed using a SAS Program. This should be executed using an account that has WRITE METADATA (WM) permissions to the necessary root folder (`appLoc`) in metadata.
```sas
%let appLoc=/Shared Data/apps/DataController; /* CHANGE THIS!! */
%let serverName=SASApp;
filename dc url "https://git.datacontroller.io/dc/dc/releases/download/latest/sas9.sas";
%inc dc;
```
If you don't have internet access from SAS, download `sas9.sas` from [here](https://git.datacontroller.io/dc/dc/releases), and change the initial `compiled_apploc` and `compiled_serverName` macro variable assignments as necessary.
#### 2 - Deploy the Frontend
The Data Controller frontend comes pre-built, and ready to deploy to the root of the SAS Web Server (mid-tier).
Deploy as follows:
1. Download the `frontend.zip` file from: [https://git.datacontroller.io/dc/dc/releases](https://git.datacontroller.io/dc/dc/releases)
2. Unzip and place in the [htdocs folder of your SAS Web Server](https://sasjs.io/frontend-deployment/#sas9-deploy) - typically a subdirectory of: `!SASCONFIG/LevX/Web/WebServer/htdocs`.
3. Open the `index.html` file and update the values in the `<sasjs>` tag as follows:
* `appLoc` - same as per SAS code in the section above
* `serverType` - change this to `SAS9`
* `serverUrl` - Provide only if your SAS Mid Tier is on a different domain than the web server (protocol://SASMIDTIERSERVER:port)
* `loginMechanism` - set to `Redirected` if using SSO or 2FA
* `debug` - set to `true` to debug issues on startup (otherwise it's faster to leave it off and turn on in the application itself when needed)
The remaining properties are not relevant for a SAS 9 deployment and can be **safely ignored**.
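By way of illustration only (the exact markup shipped in your `index.html` may differ - the attribute values below are assumptions based on the properties described above), the configured tag might resemble:

```html
<sasjs
  appLoc="/Shared Data/apps/DataController"
  serverType="SAS9"
  serverUrl=""
  loginMechanism="Default"
  debug="false"
></sasjs>
```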
You can now open the app at `https://YOURWEBSERVER/unzippedfoldername` (step 2 above) and follow the configuration steps (DC Physical Location and Admin Group) to complete deployment.
#### 3 - Run the Configurator
When opening Data Controller for the first time, a configuration screen is presented. Be sure to log in with an account that has WRITE METADATA (WM) on the following metadata folders:
* `services/admin` - so the configurator STP can be deleted after being run
* `services/common` - so the `Data_Controller_Settings` STP can be updated
* `Data` - so the library and tables can be registered (using proc metalib)
There are two things to configure:
1. Path to the designated physical staging area. Make sure that the SAS Spawned Server account (eg `sassrv`) has WRITE access to this location.
2. Admin Group. ⚠️ Note that anyone in this group will have unrestricted access to Data Controller! ⚠️ "Unrestricted access" is provided by code logic. Post installation, Data Controller will never update nor modify metadata.
!!! note
    If you do not see any groups, then it is possible your Stored Process is running from a different metadata repository to the location of your SAS users (eg Foundation). To fix this, update the `services/admin/configurator` STP with this code: `%let dc_repo_users=YOURUSERMETAREPO;`
After you click submit, the Stored Process will run, configure the staging area and create the library tables (as datasets).
You will then be presented with three further links:
1. Refresh Data Catalog. Run this to scan all available datasets and update the catalog.
2. Refresh Table Metadata. Run this to update the table-level data lineage.
3. Launch. Currently this feature only works for streaming apps - just refresh the page for a full deployment.
#### 4 - Performance Enhancement
The most common performance bottlenecks (# of available connections, memory in each connection) can be addressed by the following (administrator) actions:
* Increasing the number of multibridge connections in SMC
* Increasing MEMSIZE (eg `-memsize 4G`) in the STP Options file
### Streaming
The streaming approach is optimised for rapid deployment, and works by bundling the frontend into metadata. This is a highly inefficient way to serve web content, and thus should only really be used for demos / evaluation purposes.
Deployment is very easy - just run the SAS code below (after changing the `appLoc`):
```sas
%let appLoc=/Shared Data/apps/DataController; /* CHANGE THIS!! */
filename dc url "https://git.datacontroller.io/dc/dc/releases/download/vX.X.X/demostream_sas9.sas"; /* use actual version number */
%inc dc;
```
If you don't have internet access from your SAS environment, just download `demostream_sas9.sas` from [https://git.datacontroller.io/dc/dc/releases](https://git.datacontroller.io/dc/dc/releases) and modify the `appLoc` on line 2, as follows:
![](img/sas9_apploc.png)
After that, continue to the configuration as described above.
## Deployment Diagram
A Full Deployment of Data Controller for SAS 9 consists of:
* Frontend on the web server
* Stored Processes (+ Library & Table definitions) in metadata
* Staging Area on the physical filesystem
* Database _or_ SAS Base library
The below areas of the SAS platform are modified when deploying Data Controller:
![](img/dci_deploymentdiagram.svg)
<!--img src="/img/dci_deploymentdiagram.svg" height="350" style="border:3px solid black" -->
### Client Device
Nothing needs to be deployed or modified on the client device. We support a wide range of browsers (the same as SAS). Browsers make requests to the SAS Web Server, and will cache assets such as JS, CSS and images. Some items (such as dropdowns) are kept in local storage to improve responsiveness.
### SAS Mid Tier
A single `index.html` file plus several CSS / JS / image files are served from a subfolder in the static content area of the SAS Web Server.
This is served up by the _existing_ SAS Web Server, no additional server (running) process is required.
If you are running more than one web server, you will need to deploy to them all.
### SAS Application Server
Given the enhanced permissions needed of the system account, a dedicated / secured STP instance is recommended as described [here](/dci-stpinstance).
All deployments of Data Controller also make use of a physical staging directory. This is used to store staged data, logs, plus CSV and Excel files as uploaded by end users. This directory should NOT be accessible by end users - only the SAS system account (eg `sassrv`) requires access to this directory.
A typical small deployment will grow by 10-20 MB each month. A very large enterprise customer, with 100 or more editors, might generate up to 1 GB or so per month, depending on the size and frequency of the Excel EUCs and CSVs being uploaded. Web modifications are restricted to modified rows only, so are typically just a few KB in size.
### SAS Metadata Server
The items deployed to metadata include:
* Folder tree
* Stored Processes
* Library Object & tables
All SAS code is embedded in Stored Processes (so there is no need to deploy programs to the file system, no SASAUTOs). There is no use of X commands, no use of external internet access, full LOCKDOWN is supported.
After the installation process (which updates `public/settings` and removes the `admin/makedata` STP), there are no write actions performed against metadata.
### Databases
We strongly recommend that the Data Controller configuration tables are stored in a database for concurrency reasons.
We have customers in production using Oracle, Postgres, Netezza, Redshift and SQL Server to name a few. Contact us for support with DDL and migration steps for your chosen vendor.
!!! note
Data Controller does NOT modify schemas! It will not create or drop tables, or add/modify columns or attributes. Only data _values_ (not the model) can be modified using this tool.
To caveat the above - it is also quite common for customers to use a BASE engine library. Data Controller ships with mechanisms to handle locking (internally) but it cannot handle external contention, such as that caused when end users open datasets directly, eg with Enterprise Guide or Base SAS.
## Redeployment
The full redeployment process is as follows:
* Back up metadata (export DC folder as SPK file)
* Back up the physical tables in the DC library
* Do a full deploy of a brand new instance of DC
- To a new metadata folder
- To a new frontend folder (if full deploy)
* _Delete_ the **new** DC library (metadata + physical tables)
* _Move_ the **old** DC library (metadata only) to the new DC metadata folder
* Copy the _content_ of the old `services/public/Data_Controller_Settings` STP to the new one
- This will link the new DC instance to the old DC library / logs directory
- It will also re-apply any site-specific DC mods
* Run any/all DB migrations between the old and new DC version
- See [migrations](https://git.datacontroller.io/dc/dc/src/branch/main/sas/sasjs/db/migrations) folder
* Test and make sure the new instance works as expected
* Delete (or rename) the **old** instance
- Metadata + frontend, NOT the underlying DC library data
* Rename the new instance so it is the same as the old
- Both frontend and metadata
* Run a smoke test to be sure everything works!
If you are unfamiliar with, or unsure about, the above steps - don't hesitate to contact the Data Controller team for assistance and support.
docs/dci-deploysasviya.md
---
layout: article
title: DC SAS Viya Deployment
description: How to deploy Data Controller in a production SAS Viya environment
og_image: https://docs.datacontroller.io/img/dci_deploymentdiagramviya.png
---
# SAS Viya Deployment
## Overview
Data Controller for SAS Viya consists of a frontend, a set of Job Execution Services, a staging area, a Compute Context, and a database library. The library can be a SAS Base engine if desired, however this can cause contention (eg table locks) if end users are able to connect to the datasets directly, eg via Enterprise Guide or Base SAS.
A database that supports concurrent access is highly recommended.
## Prerequisites
### System Account
Data Controller makes use of a system account for performing backend data updates and writing to the staging area. This needs to be provisioned in advance using the Viya admin-cli. The process is well described here: [https://communities.sas.com/t5/SAS-Communities-Library/SAS-Viya-3-5-Compute-Server-Service-Accounts/ta-p/620992](https://communities.sas.com/t5/SAS-Communities-Library/SAS-Viya-3-5-Compute-Server-Service-Accounts/ta-p/620992)
### Database
Whilst we do recommend that Data Controller configuration tables are stored in a database for concurrency reasons, it is also possible to use a BASE engine library, which is adequate if you only have a few users.
To migrate the control library to a database, first perform a regular deployment; afterwards you can generate the DDL and update the settings file.
Make sure the system account (see above) has full read / write access.
!!! note
"Modify schema" privileges are not required.
### Staging Directory
All deployments of Data Controller make use of a physical staging directory. This is used to store logs, as well as CSV and Excel files uploaded by end users. This directory should NOT be accessible by end users - only the SAS system account requires access to this directory.
A typical small deployment will grow by 5-10 MB each month. A very large enterprise customer, with 100 or more editors, might generate up to 0.5 GB or so per month, depending on the size and frequency of the Excel EUCs and CSVs being uploaded. Web modifications are restricted to modified rows only, so are typically just a few KB in size.
## Deployment Diagram
The below areas of the SAS Viya platform are modified when deploying Data Controller:
<img src="/img/dci_deploymentdiagramviya.svg" height="350" style="border:3px solid black" >
## Deployment
Data Controller deployment is split between 3 deployment types:
* Demo version
* Full Version (manual deploy)
* Full Version (automated deploy)
<!--
## Full Version - Manual Deploy
-->
There are several parts to this process:
1. Create the Compute Context
2. Deploy Frontend
3. Prepare the database and update settings (optional)
4. Update the Compute Context autoexec
### Create Compute Context
The Viya Compute context is used to spawn the Job Execution Services - such that those services may run under the specified system account, with a particular autoexec.
We strongly recommend a dedicated compute context for running Data Controller. The setup requires an Administrator account.
* Log onto SAS Environment Manager, select Contexts, View Compute Contexts, and click the Create icon.
* In the New Compute Context dialog, enter the following attributes:
* Context Name
* Launcher Context
* Attribute pairs:
* reuseServerProcesses: true
* runServerAs: {{the account set up [earlier](#system-account)}}
* Save and exit
!!! note
XCMD is NOT required to use Data Controller.
### Deploy frontend
Unzip the frontend into your chosen directory (eg `/var/www/html/DataController`) on the SAS Web Server. Open `index.html` and update the following inside `dcAdapterSettings`:
- `appLoc` - this should point to the root folder on SAS Drive where you would like the Job Execution services to be created. This folder should NOT initially exist (if it is found, the backend will not be deployed)
- `contextName` - here you should put the name of the compute context you created in the previous step.
- `dcPath` - the physical location on the filesystem to be used for staged data. This is only used at deployment time, it can be configured later in `$(appLoc)/services/settings.sas` or in the autoexec if used.
- `adminGroup` - the name of an existing group, which should have unrestricted access to Data Controller. This is only used at deployment time, it can be configured later in `$(appLoc)/services/settings.sas` or in the autoexec if used.
- `servertype` - should be SASVIYA
- `debug` - can stay as `false` for performance, but could be switched to `true` for debugging startup issues
- `useComputeApi` - use `true` for best performance.
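Pieced together, the edited settings might look like the sketch below (every value is an illustrative assumption to be replaced with your own; the surrounding markup in `index.html` may differ):

```
dcAdapterSettings: {
  appLoc: "/Public/app/DataController",   /* must not yet exist on SAS Drive */
  contextName: "DataControllerContext",   /* the compute context created above */
  dcPath: "/opt/dcstaging",               /* physical staging directory */
  adminGroup: "DC Admins",                /* existing unrestricted group */
  servertype: "SASVIYA",
  debug: false,
  useComputeApi: true
}
```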
![Updating index.html](img/viyadeployindexhtml.png)
Now, open https://YOURSERVER/DataController (using whichever subfolder you deployed to above) using an account that has the SAS privileges to write to the `appLoc` location.
You will be presented with a deployment screen like the one below. Be sure to check the "Recreate Database" option and then click the "Deploy" button.
![viya deploy](img/viyadeployauto.png)
Your services are deployed! And the app is operational, albeit still a little sluggish, as every single request is using the APIs to fetch the content of the `$(appLoc)/services/settings.sas` file.
To improve responsiveness by another 700ms we recommend you follow the steps in [Update Compute Context Autoexec](/dci-deploysasviya/#update-compute-context-autoexec) below.
### Deploy Database
If you have a lot of users, such that concurrency (locked datasets) becomes an issue, you might consider migrating the control library to a database.
The first part to this is generating the DDL (and inserts). For this, use the DDL exporter as described [here](/admin-services/#export-database). If you need a flavour of DDL that is not yet supported, [contact us](https://datacontroller.io/contact/).
Step 2 is simply to run this DDL in your preferred database.
Step 3 is to update the library definition in the `$(appLoc)/services/settings.sas` file using SAS Studio.
### Update Compute Context Autoexec
First, open the `$(appLoc)/services/settings.sas` file in SAS Studio, and copy the code.
Then, open SAS Environment Manager, select Contexts, View Compute Contexts, and open the context we created earlier.
Switch to the Advanced tab and paste in the SAS code copied from SAS Studio above.
It will look similar to:
```
%let DC_LIBREF=DCDBVIYA;
%let DC_ADMIN_GROUP={{YOUR DC ADMIN GROUP}};
%let DC_STAGING_AREA={{YOUR DEDICATED FILE SYSTEM DRIVE}};
libname &dc_libref {{YOUR DC DATABASE}};
```
To explain each of these lines:
* `DC_LIBREF` can be any valid 8 character libref.
* `DC_ADMIN_GROUP` is the name of the group which will have unrestricted access to Data Controller
* `DC_STAGING_AREA` should point to the location on the filesystem where the staging files and logs are stored
* The final libname statement can also be configured to point at a database instead of a BASE engine directory (contact us for DDL)
If you have additional libraries that you would like to use in Data Controller, they should also be defined here.
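As an illustration, a completed autoexec for a Postgres-backed installation might look as follows (server name, credentials, paths and group name are all assumptions to be replaced with your own; a SAS/ACCESS engine licence is required for database librefs):

```
/* illustrative autoexec - all values are site-specific assumptions */
%let DC_LIBREF=DCDBVIYA;
%let DC_ADMIN_GROUP=DC Unrestricted Users;
%let DC_STAGING_AREA=/opt/dcstaging;
/* swap the engine and connection options for your own database */
libname &dc_libref postgres server=dbserver port=5432
  user=dcuser password="CHANGEME" database=dcdb schema=dc;
```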
<!--
## Full Version - Automated Deploy
The automated deploy makes use of the SASjs CLI to create the dependent context and job execution services. In addition to the standard prerequisites (a registered viya system account and a prepared database) you will also need:
* a local copy of the [SASjs CLI](https://sasjs.io/sasjs-cli/#installation)
* a Client / Secret - with an administrator group in SCOPE, and an authorization_code GRANT_TYPE. The SASjs [Viya Token Generator](https://github.com/sasjs/viyatoken) may help with this.
### Prepare the Target and Token
To configure this part (one time, manual step), we need to run a single command:
```
sasjs add
```
A sequence of command line prompts will follow for defining the target. These prompts are described [here](https://sasjs.io/sasjs-cli-add/). Note that `appLoc` is the SAS Drive location in which the Data Controller jobs will be deployed.
### Prepare the Context JSON
This file describes the context that the CI/CD process will generate. Save this file, eg as `myContext.json`.
```
{
  "name": "DataControllerContext",
  "attributes": {
    "reuseServerProcesses": true,
    "runServerAs": "mycasaccount"
  },
  "environment": {
    "autoExecLines": [
      "%let DC_LIBREF=DCDBVIYA;",
      "%let DC_ADMIN_GROUP={{YOUR DC ADMIN GROUP}};",
      "%let DC_STAGING_AREA={{YOUR DEDICATED FILE SYSTEM DRIVE}};",
      "libname &dc_libref {{YOUR DC DATABASE}};"
    ],
    "options": []
  },
  "launchContext": {
    "contextName": "SAS Job Execution launcher context"
  },
  "launchType": "service"
}
```
### Prepare Deployment Script
The deployment script will run on a build server (or local desktop) and execute as follows:
```
# Create the SAS Viya Target
sasjs context create --source myContext.json --target myTarget
```
-->
A free version of Data Controller is available for evaluation purposes.
### Deployment
#### Import
Simply import the SPK (using SAS Management Console or Data Integration Studio) to the desired location in the metadata tree. During the import (step 5 of the wizard), be sure to change the location of the library (BASE engine) to a physical **directory folder** to which the Stored Process system account (eg `sassrv`) has **write access**.
#### Permissions
Be sure that the user account you will use in the [configuration](#configuration) step below has WRITE METADATA (WM) on the `/DataController/services/admin` and `/DataController/Data` folders, and that anyone who will use the app has READ.
### Configuration
Navigate to the web application (eg `https://[YOURHOST]/SASStoredProcess?_action=1063`) and find the location where the app was imported. Then run the `DataController/services/admin/configurator` stored process.
!!! note
Use the same user account as you used to import the SPK, to avoid metadata permissions issues! This may mean logging out / logging back in to the web application.
This displays a screen in which one of the SAS Metadata Groups (to which your account belongs) can be chosen. Selecting any of these groups will build / rebuild all the configuration tables (placing logs in a subfolder of the previously configured library location) and provide the chosen group with **unrestricted** access to the tool.
If you do not see any groups, then it is possible your Stored Process is running from a different metadata repository to the location of your SAS users (eg Foundation). To fix this, re-run the configuration STP with the `&dc_repo_users=YOURMETAREPO` url parameter.
![evaltree](img/dci_evalconfig.png)
!!! note
"Unrestricted access" is provided by code logic. Once installed, Data Controller does not ever update or modify metadata. During installation, the services in the `/services/admin` folder are updated (configuration) or removed (to prevent accidental reinstall). Also the tables are registered in the `/Data` folder using `proc metalib`.
## Usage
The demo version has been optimised for a rapid install, and should not be considered for production use:
2) Requires BASE engine for config tables, with high risk of table locks
3) Interface is not licenced for commercial (or production) use, and not supported
4) Underlying macros are not licensed for re-use on other (internal) projects
5) The embedded HandsOnTable library is not licenced for commercial use without a licence key
[Contact us](https://datacontroller.io/contact) for a full-featured, fully licenced, scalable and supported deployment of Data Controller at your earliest convenience!
# Data Controller for SAS® - Frontend Deployment
## Overview
The Data Controller front end comes pre-built, and ready to deploy to the root of the SAS Web Server (mid-tier), typically `htdocs`.
## Instructions
1 - Unzip dcfrontend.zip and upload the entire `datacontroller` directory to the static content server.
2 - Open the `h54s.config` file and update the `metadataRoot` value to the location of the Stored Processes as per [backend](dci-backend.md) deployment. Remember to include the trailing slash (`/`).
It should now be possible to use the application - simply navigate to `YOURSASWEBLOC.domain/yourRoot/datacontroller` and sign in!
The next step is to [configure](dcc-tables.md) the tables.
# Data Controller for SAS® - System Requirements
## Overview
The Data Controller is a SAS Web Application, deployed into an existing SAS platform, and as such has no special requirements beyond what is typically available in a SAS Foundation or Viya environment.
## SAS 9
### Backend
A SAS Foundation deployment of at least 9.4M3 must be available. Earlier versions of SAS can be supported, on request. A SAS Stored Process Server must be configured, running under a system account.
### Mid-Tier
A web server with `/SASLogon` and the SAS SPWA must be available to end users.
## SAS Viya
A minimum of Viya 3.5 is recommended to make use of the ability to run a shared compute instance.
## Frontend
All major browsers are supported, including IE11 (earlier versions of IE may not work properly).
For IE, note that [compatibility view](dci-troubleshooting#Internet Explorer - blank screen) must be disabled.
docs/dci-stpinstance.md
# Data Controller for SAS® - Stored Process Server
## Overview
Data Controller requires that the operating system account (eg sassrv) has the ability to WRITE to each of the libraries set up for editing. For SAS installations where business users have the unrestricted ability to create Stored Processes in production, this can represent a security risk.
Under these circumstances, it is recommended to create a dedicated STP server instance for Data Controller, with a dedicated system account.
!!! note
Data Controller only updates data (add, delete, modify records). It does not need the ability to create new (permanent) tables, or modify the structure of existing tables.
## Set up DC account
It is recommended to have a user for each environment in which DC is deployed, eg:
* dcsrv_dev
* dcsrv_test
* dcsrv_prod
After these OS users are created, log into SMC in relevant environment and open User Manager. Adjust as follows:
* Open SAS General Servers group
* Select Accounts tab
* Add the dcsrv_[ENV] user in DefaultAuth domain
## STP Server Configuration - 9.4
Open the SAS Deployment Wizard and deploy a new Application Context Server from the panel windows.
Be sure to use the relevant dcsrv_[env] user as configured above.
Now head to the [security](#security) section.
## STP Server Configuration - 9.3
As the wizard does not exist in 9.3 it is necessary to copy the folder structure.
### Clone existing directory
Navigate to the SASApp directory on relevant machine (eg `!SASCONFIG/Lev1/SASApp`) and make a copy of the StoredProcessServer folder, and rename it (eg DataControllerSTPsvr).
Modify the contents of the new folder as follows:
* Autoexec (and usermods) adjust content to ensure it is relevant to a DC context
* sasv9_usermods.cfg suggested items:
```
-memsize 0
-UTILLOC "/change/only/if/needed"
-logconfigloc "location of DataControllerSTPsvr logconfig.xml file (in new folder)"
```
The following files should have all instances of `\StoredProcessServer\` replaced with `\DataControllerSTPsvr\`:
* Logconfig.xml
* Logconfig.trace.xml
* StoredProcessServer.bat
* Logconfig.apm.xml
* Sasv9.cfg
* Dtest folder - we don't believe this is used, but make the changes anyway (same as above: change all files within it to swap `StoredProcessServer` for `DataControllerSTPsvr`)
* Sasuser folder - EMPTY the CONTENTS (remove all files). They aren't relevant in the Data Controller context.
### Add Server
Open ServerManager and adjust as follows:
* Log into SMC in relevant environment
* Open ServerManager
* Right click / new server
* Select Application Server
* Name as `SAS_App_DataController`
* Click Next / select "Stored Process Server" / Next
* Select "Custom" / Next
* Command = `"C:\SAS92\Config\Lev1\SASApp\SASDataEditorStoredProcessServer\StoredProcessServer.bat"` (adjust as appropriate)
* Object server parameters = empty
* Multiuser - select dcsrv_[Env]
* Choose SASApp server machine (put in RH box)
* Next / Bridge Connection(default) / Next
* Bridge Port: 8602
* Add / Single Port / 8612
* Add / Single Port / 8622
* Add / Single Port / 8632
* Add at least NINE connections, up to a maximum of 5 per CPU core
* Next / finish
Next, refresh Server Manager to see the new SAS_App_DataController server. Expand and adjust as follows:
* Right click SAS_App_DataController-Logical server (first nest), properties, Load Balancing tab, select "Response Time"
- Availability timeout 10 seconds
- Ok / exit
* Right click SAS_App_DataController Stored Process (second nest), properties, Options tab, Advanced Options, Load Balancing
- Max clients 1
- Launch timeout 10 seconds
- Recycle activation limit 1
* Right click Object Spawner (inside Server Manager) / Properties / Servers, and add the new Data Controller STP from "Available Servers" to "Selected Servers"
* Bounce the object spawner
#### VALIDATION (Windows)
* Open a command prompt as an administrator, and run: `netstat -aon | find /I "8602"` (this will check whether the new server is listening on the relevant port)
* Execute the .bat file to ensure a Base SAS session can be created in the relevant context (`!SASConfig\Lev1\SASApp\SASDataEditorStoredProcessServer\StoredProcessServer.bat`)
* In SMC (Server Manager), right click / validate the new server & test the connection
## Security
### STP Server Context
To protect the new STP server context, the following initialisation code must be added:
```
data _null_;
  if not ('/APPROVED/DC/FOLDER/LOCATION'=:symget('_program')) then do;
    file _webout;
    put 'Access to this location has not been approved';
    put 'This incident may be reported';
    abort cancel;
  end;
run;
```
Save this program in the `DataControllerSTPsvr` folder. Then open Server Manager in SMC and expand the SAS_App_DataController server. Right click SAS_App_DataController-Logical server (first nest), properties, Options tab, Set Server Properties, Request.
The `init program` value should be set to the location of the program above.
---
layout: article
title: Troubleshooting
description: Descriptions of common issues when working with Data Controller, and steps for resolution.
og_image: https://docs.datacontroller.io/img/cannotimport.png
---
# Data Controller for SAS® - Troubleshooting
## Overview
The imported version of Data Controller is set up to work with the Stored Process
```
/* get the macros (or download / %include separately) */
filename mc url "https://raw.githubusercontent.com/sasjs/core/main/all.sas";
%inc mc;
/* put the path to your Data Controller folder here */
...
run;
/* run the program */
%inc tmp;
```
## Custom Library
If you wish to change the default *libref* or *libname* then there are TWO items to configure:
1) The library itself
2) The `mpelib` macro variable and the libname statement in the `/Admin/Data_Controller_Settings` stored process.
!!! note
Be sure to make this change *after* running the configurator, to ensure the tables are first registered!
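As a sketch, the relevant part of the settings stored process might then read as follows (the libref and path are illustrative assumptions; keep your own values):

```
/* illustrative only - use your own libref and physical path */
%let mpelib=MYDC;
libname &mpelib "/your/dc/library/path";
```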
## Permission is needed to access the ServerContext Object
After a successful install, your business user may see the following message:
![Permission is needed to access the ServerContext object attached to the stored process.](img/error_obtaining_stp.png)
> Error obtaining stored process from repository
>
> Permission is needed to access the ServerContext object attached to the stored process.
The reason is that the context chosen when importing the SPK (perhaps, SASApp) is not available to your business user. It's likely you have multiple contexts.
The SPK must be re-imported with the correct context chosen. This may require regenerating the tables, or adjusting the permissions, if the new context uses a different system account.
## Stored Processes Cannot Be Imported Into A Project Repository
During the SPK import on a SAS 9 instance you may see the following dialog:
![Stored processes cannot be imported into a project repository](img/cannotimport.png)
> Stored processes cannot be imported into a project repository
This can happen when importing with Data Integration Studio and your user profile is making use of a personal project repository. Try re-connecting with the Foundation repository, or import with SAS Management Console (which does not support project repositories).
## There is no LogicalServer of the type requested associated with the ServerContext in metadata.
This can happen if you enter the wrong `serverName` when deploying the SAS program on an EBI platform. Make sure it matches an existing Stored Process Server Context.
The error may also be thrown due to an encoding issue - changing to a UTF-8 server has helped at least one customer.
## Determining Application Version
The app version is bundled into the frontend during the release, and is visible by clicking your username in the top right.
You can also determine the app version (and SASjs Version, and build time) by opening browser Development Tools and running `appinfo()` in the console.
docs/dcu-datacatalog.md
# Data Controller for SAS: Data Catalog
Data Controller collects information about the size and shape of the tables and columns. The Catalog does not contain information about the data content (values).
The catalog is based primarily on the existing SAS dictionary tables, augmented with attributes such as primary key fields, filesize / libsize, and number of observations (eg for database tables).
Frequently changing data (such as nobs, size) are stored on the MPE_DATASTATUS_XXX tables. The rest is stored on the MPE_DATACATALOG_XXX tables.
## Tables
### Libraries
This table contains library level attributes to provide a high level overview of data coverage. Note that unless you are an administrator, you are unlikely to have the ability to view / open all of these libraries. To avoid errors when opening invalid libraries, you can add pipe-separated LIBREFs to the DCXXXX.MPE_CONFIG table (var_scope='DC_CATALOG', var_name='DC_IGNORELIBS').
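A minimal sketch of such a config entry is below. The scope and name values come from the paragraph above; the `VAR_VALUE` column name and the librefs are assumptions - check your `MPE_CONFIG` table structure before running.

```
/* sketch: exclude librefs from the catalog refresh          */
/* VAR_VALUE is an assumed column name - verify in MPE_CONFIG */
proc sql;
insert into DCXXXX.MPE_CONFIG (var_scope, var_name, var_value)
  values ('DC_CATALOG', 'DC_IGNORELIBS', 'TEMP|SCRATCH');
quit;
```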
### Tables
Table attributes are split between those that change infrequently (eg PK_FIELDS) and those that change often (eg size, modified date, and NOBS).
### Variables
Variable attributes come from dictionary tables with an extra PK indicator. A PK is identified by the fact the variable is within an index that is both UNIQUE and NOTNULL. Variable names are always uppercase.
## Assumptions
The following assumptions are made:
* Data _Models_ (eg attributes) are not sensitive. If so the catalog tables should be disabled.
* Users can see all tables in the libraries they can access. The refresh process will close out any tables that are not found, if the user can see at least one table in a library.
* For a particular site, libraries are unique on LIBREF.
If you have duplicate librefs, specific table security setups, or sensitive models - contact us.
# Data Controller for SAS: File Uploads
Files can be uploaded via the Editor interface - first choose the library and table, then click "Upload". Currently only CSV files are supported, although these can be provided with non standard delimiters (such as semicolon).
<img src="/img/dcu-files1.png" height="350" style="border:3px solid black" >
The following should be considered when uploading data in this way:
- A header row (with variable names) is required
- Variable names must match the target (not case sensitive). An easy way to ensure this is to download the data from Viewer and use this as a template.
- Duplicate variable names are not permitted
- Missing columns are not permitted
- Additional columns are ignored
- The order of variables does not matter
- The delimiter is extracted from the header row - so for `var1;var2;var3` the delimiter would be assumed to be a semicolon
- The above assumes the delimiter is the first special character! So `var,1;var2;var3` would fail
- The following characters should not be used as delimiters
- doublequote
- quote
- space
- underscore
When loading dates, be aware that Data Controller makes use of the `ANYDTDTE` and `ANYDTDTM` informats.
This means that uploaded date / datetime values should be unambiguous (eg `01FEB1942` rather than `01/02/42`) to avoid confusion - the latter could be interpreted as `02JAN2042`, depending on your locale and `YEARCUTOFF` option settings.
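The difference can be seen in the following sketch (using the `ANYDTDTE` informat mentioned above; the second result depends on your session `LOCALE` and `YEARCUTOFF` settings, so no single answer is guaranteed):

```sas
/* Unambiguous vs ambiguous date values via the ANYDTDTE informat */
data _null_;
  d1=input('01FEB1942',anydtdte.); /* always 01FEB1942 */
  d2=input('01/02/42',anydtdte.);  /* LOCALE / YEARCUTOFF dependent */
  put d1=date9. d2=date9.;
run;
```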
!!! tip
To get a copy of a file in the right format for upload, use the [file download](/dc-userguide/#usage) feature in the Viewer tab

# Data Lineage
The Data Lineage feature is available for SAS 9 installs. The implementation differs depending on whether the lineage is table level or column level.
## Table Level lineage
Table level lineage is relatively straightforward, and so it is extracted in a single ad-hoc `proc metadata` call and stored in the `DATACTRL.MPE_LINEAGE_TABS` table. To trigger the population (or refresh) of this table, simply execute the `YOURSERVER/SASStoredProcess/?_program={appLoc}/DataController/admin/refreshtablelineage` service from a browser.
![jobmetadata](img/dcu-jobmeta.png)
This data is stored with SCD2 so it is possible to track changes to lineage over time.
When users execute table level lineage, queries are made against this table, so there is very little metadata impact.
## Column Level lineage
Column level lineage is more complex, as it also includes all the transforms and calculation logic along the way. For this reason it is performed at runtime - meaning the initial request can take some time if there is a lot of lineage.
After the first request, subsequent lineage requests (for that particular column and direction) are cached in the `DATACTRL.MPE_LINEAGE_COLS` table for faster response times.
If the job is changed and a new diagram is needed, the user can click the 'refresh' checkbox.
## Export Types
Both Table and column level lineage pages allow the following export formats:
* SVG - high-res diagram format
* PNG - image format
* DOT - the graphviz language format used to generate the diagram
* CSV - a download of all the sources and targets in the diagram

# Data Controller for SAS: Viewer
## Viewer
The viewer screen provides a raw view of the underlying table.
Choose a library, then a table, and click view to see the first 5000 rows.
A filter option is provided should you wish to view a different section of rows.
The following libraries will be visible:
* All libraries available on startup (session autoexec)
* Any libraries configured in the `services/public/[Data_Controller_Settings/settings]` Stored Process / Viya Job
* All libraries available to the logged in user in metadata (SAS 9 only)
Row and Column level security can also be applied in VIEW mode, as can additional table-level permissions (MPE_SECURITY table).
## Full Table Search
A single search box can be used to make a full table search on any character or numeric value, using this [macro](https://core.sasjs.io/mp__searchdata_8sas.html).
<iframe width="560" height="315" src="https://www.youtube.com/embed/i27w-xq85WQ" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Options
This button shows a range of options. If the table is editable, you will also see an EDIT option.
### Download
The Download button gives several options for obtaining the current view of data:
1) CSV. This provides a comma delimited file.
2) Excel. This provides a tab delimited file.
3) SAS Datalines. This provides a SAS program with data as datalines, so that the data can be rebuilt as a SAS table.
4) SAS DDL. A download of a DDL file using SAS flavoured syntax.
5) TSQL DDL. A DDL download using SQL Server flavoured syntax.
Note - if the table is registered in Data Controller as being TXTEMPORAL (SCD2) then the download option will prefilter for the _current_ records and remove the valid from / valid to variables. This makes the CSV suitable for DC file upload, if desired.
### Web Query URL
This option gives you a URL that can be used to import data directly into third party tools such as Power BI or Microsoft Excel (as a "web query"). You can set up a filter, eg for a particular month, and refresh the query on demand using client tooling such as VBA.

---
layout: article
title: Dynamic Cell Dropdown
description: Configure SAS programs to determine exactly which values can appear within which cells in your Data Controller table!
og_image: https://docs.datacontroller.io/img/cell_validation1.png
---
# Dynamic Cell Dropdown
This is a simple, but incredibly powerful feature! Configure a SAS process to run when clicking a particular cell. Data Controller will send the *row* to SAS, and your SAS program can use the values in that row to determine a *column* of values to send back - which will be used in the frontend selectbox.
So if you'd like the user to only see products for a particular category, or ISINs for a particular asset group, you can achieve that easily.
This feature is used extensively in Data Controller to fetch tables specific to a library, or columns specific to a table:
![](img/cell_validation1.png)
You can also use the response to populate _other_ dropdowns (also in the same row) in the same request - these are called 'extended validations'.
<iframe width="560" height="315" src="https://www.youtube.com/embed/rmES77aIr90" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Frontend Configuration
Open the MPE_VALIDATIONS table and configure the library, table and column that should contain the selectbox. In the RULE_TYPE column, enter either:
* HARDSELECT_HOOK - The user entry MUST match the returned values
* SOFTSELECT_HOOK - The user can view the list but type something else if they wish
The RULE_VALUE column should contain the full path to the SAS Program, Viya Job or SAS 9 Stored process that you would like to execute. If the value ends in ".sas" then it is assumed to be a SAS program on a directory, otherwise a SAS web service (STP or Viya Job).
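For illustration, the rule columns might be populated as follows (the paths are hypothetical; the other columns in MPE_VALIDATIONS identify the target library, table and column as described above):

|RULE_TYPE|RULE_VALUE|
|---|---|
|HARDSELECT_HOOK|/Shared Data/validations/product_picker|
|SOFTSELECT_HOOK|/opt/sas/code/validations/region_picker.sas|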
## Backend Configuration
If creating a Stored Process, be sure to deselect the 'automatic SAS macros' - the presence of %stpbegin or %stpend autocall macros will cause problems with the Data Controller backend.
You can write any SAS code you wish. For examples of hook scripts you can look at the Data Controller internal validation programs (listed in the MPE_VALIDATIONS table). You will receive the following as inputs:
* `work.source_row` -> A dataset containing the **current row** being modified in Data Controller. This will have already been created in the current SAS session. All variables are available. Use this to filter the initial values in `work.dynamic_values`.
* `&DC_LIBREF` -> The DC control library
* `&LIBDS` - The library.dataset being filtered
* `&VARIABLE_NM` - The column for which to supply the validation
The following tables should be created in the WORK library as outputs:
* `work.dynamic_values`
* `work.dynamic_extended_values` (optional)
### `WORK.DYNAMIC_VALUES`
This output table can contain up to three columns:
* `display_index` (optional; mandatory if using `dynamic_extended_values`) - a numeric key used to join the two tables
* `display_value` - always character
* `raw_value` - unformatted character or numeric according to source data type
Example values:
|DISPLAY_INDEX:best.|DISPLAY_VALUE:$|RAW_VALUE|
|---|---|---|
|1|$77.43|77.43|
|2|$88.43|88.43|
### `WORK.DYNAMIC_EXTENDED_VALUES`
This output table is optional. If provided, it will map the DISPLAY_INDEX from the DYNAMIC_VALUES table to additional column/value pairs that will be used to populate dropdowns for _other_ cells in the _same_ row.
The following columns should be provided:
* `display_index` - a numeric key joining each value to the `dynamic_values` table
* `extra_col_name` - the name of the additional variable(s) to contain the extra dropdown(s)
* `display_value` - the value to display in the dropdown. Always character.
* `display_type` - Either C or N depending on the raw value type
* `raw_value_num` - The unformatted value if numeric
* `raw_value_char` - The unformatted value if character
* `forced_value` - set to 1 to force this value to be automatically selected when the source value is changed. If anything else but 1, the dropdown will still appear, but the user must manually make the selection.
Example Values:
|DISPLAY_INDEX:best.|EXTRA_COL_NAME:$32|DISPLAY_VALUE:$|DISPLAY_TYPE:$1.|RAW_VALUE_NUM|RAW_VALUE_CHAR:$5000|FORCED_VALUE|
|---|---|---|---|---|---|---|
|1|DISCOUNT_RT|"50%"|N|0.5||.|
|1|DISCOUNT_RT|"40%"|N|0.4||0|
|1|DISCOUNT_RT|"30%"|N|0.3||1|
|1|CURRENCY_SYMBOL|"GBP"|C||"GBP"|.|
|1|CURRENCY_SYMBOL|"RSD"|C||"RSD"|.|
|2|DISCOUNT_RT|"50%"|N|0.5||.|
|2|DISCOUNT_RT|"40%"|N|0.4||1|
|2|CURRENCY_SYMBOL|"EUR"|C||"EUR"|.|
|2|CURRENCY_SYMBOL|"HKD"|C||"HKD"|1|
### Code Examples
Simple dropdown
```sas
/**
@file
@brief Simple dynamic cell dropdown for product code
@details The input table is simply one row from the
target table called "work.source_row".
Available macro variables:
@li DC_LIBREF - The DC control library
@li LIBDS - The library.dataset being filtered
@li VARIABLE_NM - The column being filtered
<h4> Service Outputs </h4>
Output should be a single table called
"work.dynamic_values" in the format below.
|DISPLAY_VALUE:$|RAW_VALUE:??|
|---|---|
|$44.00|44|
**/
%dc_assignlib(READ,mylibref)
proc sql;
create table work.DYNAMIC_VALUES as
select distinct some_product as raw_value
from mylibref.my_other_table
where area in (select area from work.source_row)
  order by 1;
quit;
```
Extended dropdown
```sas
proc sql;
create table work.source as
select libref,dsn
from &DC_LIBREF..MPE_TABLES
where tx_to > "%sysfunc(datetime(),E8601DT26.6)"dt
  order by 1,2;
quit;
data work.DYNAMIC_VALUES (keep=display_index raw_value display_value);
set work.source end=last;
by libref;
if last.libref then do;
display_index+1;
raw_value=libref;
display_value=libref;
output;
end;
if last then do;
display_index+1;
raw_value='*ALL*';
display_value='*ALL*';
output;
end;
run;
data work.dynamic_extended_values(keep=display_index extra_col_name display_type
display_value RAW_VALUE_CHAR raw_value_num forced_value);
set work.source end=last;
by libref dsn;
retain extra_col_name 'ALERT_DS';
retain display_type 'C';
retain raw_value_num .;
raw_value_char=dsn;
display_value=dsn;
forced_value=0;
if first.libref then display_index+1;
if last.libref then do;
display_value='*ALL*';
raw_value_char='*ALL*';
forced_value=1;
output;
end;
else output;
if last then do;
display_value='*ALL*';
raw_value_char='*ALL*';
forced_value=1;
output;
end;
run;
```
## Technical Notes
When first clicking on a 'dynamic dropdown' cell, the frontend will first hash the entire row, and store the subsequent response from SAS against this hash in an internal lookup table. In this way, the lookup table can be subsequently referenced to vastly improve performance (by avoiding unnecessary server requests).
The lookup event will occur immediately upon clicking on the (dynamic dropdown) cell. If the row has not changed since the previous click, the response will be instant. If any value in the row HAS changed, and that particular combination of values has not previously been requested (in the same browser session), then a request to SAS will need to take place before the dropdown values are shown.

# Data Controller for SAS® - Emails
## Overview
Data Controller enables email alerts for users when tables are:
Emails are sent after any post edit / post approve hooks. They can be sent when
Email addresses are looked for first in `DCXXXXXX.MPE_EMAILS`. If they are not found there, then a metadata search is made (the first email found in metadata for that user is used).
<iframe src="https://player.vimeo.com/video/343401440" width="640" height="360" frameborder="0" allow="autoplay; fullscreen" allowfullscreen></iframe>
## Setup
As not every site has emails configured, this feature is switched OFF by default.
To switch it on, navigate to `DCXXXXXX.MPE_CONFIG` and set the value for `DC_EMAIL_ALERTS` to be `YES` (uppercase).

Data Controller for SAS® Evaluation Agreement
====================
The terms and conditions contained below constitute a legal agreement. This agreement ("Agreement") contains herein the entire agreement between the licensee ("Licensee") and Bowe IO Ltd. Read this agreement carefully. By downloading, installing, and/or examining the product, you acknowledge:
1 - You are authorized to enter this agreement for and on behalf of your company, and are doing so, and 2 - You have read, understand and agree that you and the company shall be bound by these terms and conditions and every modification and addition provided for.
Software products included with this product that are not Bowe IO Ltd products are licensed to you by the software provider. Please refer to the license contained in the provider's product for their terms of use.
## 1. License Grant.
Bowe IO Ltd grants you a limited, non-exclusive, non-transferable license to use, **for evaluation/non-production purposes only**, the Bowe IO Ltd software program(s) known as Data Controller for SAS® (the "Software") - and related product documentation - at no charge, subject to the terms and restrictions set forth in this License Agreement. You are not permitted to use the Software in any manner not expressly authorized by this License. You acknowledge and agree that ownership of the Software and all subsequent copies thereof regardless of the form or media are held by Bowe IO Ltd.
## 2. Term of Agreement.
Your license is effective until terminated by Bowe IO Ltd (at the sole discretion of Bowe IO Ltd and without notice). The License will terminate automatically if you fail to comply with any of the limitations or other requirements described herein. At termination you shall cease all use of the Software and destroy all copies, full or partial, of the Software.
## 3. Ownership Rights.
The Software and related documentation are protected by United Kingdom copyright laws and international treaties. Bowe IO Ltd, third party component providers and open source component providers own and retain all right, title and interest in and to the Software and related documentation, including all copyrights, patents, trade secret rights, trademarks and other intellectual property rights therein.
## 4. Use of Name and Trademarks.
You shall not use the name, trade names or trademarks of Bowe IO Ltd or any of its affiliates in any advertising, promotional literature or any other material, whether in written, electronic or other form, without prior approval.
## 5. Restrictions
5.1 - You may not rent, lease, lend, redistribute or sublicense the Software. You may not copy the Software other than to make archival or backup copies - provided that the backup copy includes all copyright or other proprietary notices contained on the original. You may not copy related product documentation. You may not modify, reverse engineer, decompile, or disassemble the Software, except to the extent such restriction is expressly prohibited by applicable law.
5.2 - Certain components of the Software are provided under various Open Source licenses that have been made available to Bowe IO Ltd. You may modify or replace only these Open-Sourced Components; provided that (i) the resultant Software is used in place of the unmodified Software, on a single computer; and (ii) you otherwise comply with the terms of this License and any applicable licensing terms governing use of the Open-Sourced Components. Bowe IO Ltd is not obligated to provide any maintenance, technical or other support for the resultant Software.
## 6. Exclusion of Warranties.
THE SOFTWARE IS PROVIDED TO LICENSEE "AS IS", AND ANY USE BY LICENSEE OF THE SOFTWARE WILL BE AT LICENSEE'S SOLE RISK. Bowe IO Ltd makes no warranties relating to the software, and disclaims all warranties (express or implied), including without limitation those of merchantability and fitness for any particular purpose.
## 7. Limitation of Liability.
In no event shall Bowe IO Ltd be liable for any incidental, special, indirect or consequential damages whatsoever, including, without limitation, damages for loss of profits, loss of data, business interruption or any other commercial damages or losses, arising out of or related to your use or inability to use the Bowe IO Ltd software, however caused, regardless of the theory of liability (contract, tort or otherwise) and even if Bowe IO Ltd has been advised of the possibility of such damages.
## 8. Governing law and jurisdiction
8.1 - This agreement and any disputes or claims arising out of or in connection with its subject matter are governed by and construed in accordance with the law of England.
8.2 - The parties irrevocably agree that the courts of England have exclusive jurisdiction to settle any dispute or claim that arises out of or in connection with this agreement.
## 9. Assignment/Transfers.
You may not assign or transfer this Agreement, in whole or in part, without the prior written consent of Bowe IO Ltd. Any attempted assignment or transfer in violation of this Section will be null and void.
## 10. Third Party Acknowledgements
(A) Aspects of the Software utilize or include third party software and other copyrighted material. Acknowledgements, licensing terms and disclaimers for such material are available when accessing the Software on the Bowe IO Ltd website, and your use of such material is governed by their respective terms.
(B) The Software includes certain software provided under various Open Source licenses. You may obtain complete machine-readable copies of the source code and licenses for the Open Source software at the Bowe IO Ltd Open Source website (https://docs.datacontroller.io/licenses). Open Source Software is distributed WITHOUT ANY WARRANTY, without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE
## 11. Severability.
If any provision of this Agreement is held invalid, illegal or unenforceable, the validity, legality and enforceability of any of the remaining provisions of this Agreement shall not in any way be affected or impaired.
## 12. Entire Agreement.
This Agreement is the entire agreement between you and Bowe IO Ltd concerning the Software and all related documentation and supersedes any other prior or contemporaneous agreements or communications with respect to the Software and related documentation, either written or oral.

---
layout: article
title: Excel
description: Data Controller can extract all manner of data from within an Excel file (including formulae) ready for ingestion into SAS. All versions of Excel are supported.
og_image: https://docs.datacontroller.io/img/excel_results.png
---
# Excel Uploads
Data Controller supports two approaches for importing Excel data into SAS:
- Simple - source range in tabular format, with column names/values that match the target Table. No configuration necessary.
- Complex - data is scattered across multiple ranges in a dynamic (non-fixed) arrangement. Pre-configuration necessary.
Thanks to our pro license of [sheetJS](https://sheetjs.com/), we can support all versions of Excel, large workbooks, and fast extracts. We also support the ingest of [password-protected workbooks](/videos#uploading-a-password-protected-excel-file).
Note that data is extracted from Excel _within the browser_ - meaning there is no need for any special SAS modules / products.
A copy of the original Excel file is also uploaded to the staging area. This means that a complete audit trail can be captured, right back to the original source data.
## Simple Excel Uploads
To make a _simple_ extract, select LOAD / Tables / (library/table) and click "UPLOAD" (or drag the file onto the page). No configuration necessary.
![](img/xltables.png)
The rules for data extraction are:
* Scan each sheet until a row is found containing all target columns
* Extract rows until the first *blank primary key value*
This is incredibly flexible, and means:
* data can be anywhere, on any worksheet
* data can start on any row, and any column
* data can be completely surrounded by other data
* columns can be in any order
* additional columns are simply ignored
!!! note
If the workbook contains more than one range with the target columns (eg, on different sheets), only the FIRST will be extracted.
Uploaded data may *optionally* contain a column named `_____DELETE__THIS__RECORD_____` - if this contains the value "Yes", the row is marked for deletion.
If loading very large files (eg over 10MB) it is more efficient to use CSV format, as this bypasses the local rendering engine - but it also bypasses the local DQ checks, so be careful! Examples of checks applied locally (Excel) but not remotely (CSV) include:
* Length of character variables - CSV files are truncated at the max target column length
* Length of numeric variables - if the target numeric variable is below 8 bytes then the staged CSV value may be rounded if it is too large to fit
* NOTNULL - this rule is only applied at backend when the constraint is physical (rather than a DC setting)
* MINVAL
* MAXVAL
* CASE
Note that the HARDSELECT_*** hooks are not applied to the rendered Excel values (they are only applied when actively editing a cell).
![image](https://user-images.githubusercontent.com/4420615/233036372-87b8dd02-a4cd-4f19-ac1b-bb9fdc850607.png)
### Formulas
It is possible to configure certain columns to be extracted as formulae, rather than raw values. The target column must be character, and it should be wide enough to support the longest formula in the source data. If the order of values is important, you should include a row number in your primary key.
Configuration is as follows:
![](img/excel_config_setup.png)
Once this is done, you are ready to upload:
<iframe width="560" height="315" src="https://www.youtube.com/embed/Reg803vI2Ak" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
The final table will look like this:
![](img/excel_results.png)
## Complex Excel Uploads
Through the use of "Excel Maps" you can dynamically extract individual cells or entire ranges from anywhere within a workbook - either through absolute / relative positioning, or by reference to a "matched" (search) string.
Configuration is made in the following tables:
1. [MPE_XLMAP_RULES](/tables/mpe_xlmap_rules) - detailed extraction rules for a particular map
2. [MPE_XLMAP_INFO](/tables/mpe_xlmap_info) - optional map-level attributes
Each [rule](/tables/mpe_xlmap_rules) will extract either a single cell or a rectangular range from the source workbook. The target will be [MPE_XLMAP_DATA](/tables/mpe_xlmap_data), or whichever table is configured in [MPE_XLMAP_INFO](/tables/mpe_xlmap_info).
To illustrate with an example - consider the following Excel file, in which the yellow cells need to be imported.
![](img/xlmap_example.png)
The [MPE_XLMAP_RULES](/tables/mpe_xlmap_rules) configuration entries _might_ (as there are multiple ways) be as follows:
|XLMAP_ID|XLMAP_RANGE_ID|XLMAP_SHEET|XLMAP_START|XLMAP_FINISH|
|---|---|---|---|---|
|MAP01|MI_ITEM|Current Month|`MATCH B R[1]C[0]: ITEM`|`LASTDOWN`|
|MAP01|MI_AMT|Current Month|`MATCH C R[1]C[0]: AMOUNT`|`LASTDOWN`|
|MAP01|TMI|Current Month|`ABSOLUTE F6`||
|MAP01|CB|Current Month|`MATCH F R[2]C[0]: CASH BALANCE`||
|MAP01|RENT|/1|`MATCH E R[0]C[2]: Rent/mortgage`||
|MAP01|CELL|/1|`MATCH E R[0]C[2]: Cell phone`||
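The rule grammar above can be illustrated with a short sketch. This is not the Data Controller implementation - just an illustration of the semantics, assuming `MATCH <col> R[r]C[c]: <text>` means "find `<text>` in column `<col>`, then offset `r` rows and `c` columns", and `ABSOLUTE <cell>` is a direct cell reference:

```python
# Hypothetical resolver for the XLMAP_START syntax shown above.
# Illustration only - not the actual Data Controller code.

def col_to_idx(letter: str) -> int:
    """Convert an Excel column letter (A, B, ...) to a 0-based index."""
    return ord(letter.upper()) - ord("A")

def resolve_start(rule: str, sheet: list) -> tuple:
    """Return 0-based (row, col) coordinates for an XLMAP_START rule."""
    if rule.startswith("ABSOLUTE "):
        ref = rule.split()[1]                      # e.g. "F6"
        return int(ref[1:]) - 1, col_to_idx(ref[0])
    if rule.startswith("MATCH "):
        # e.g. "MATCH B R[1]C[0]: ITEM" - search column B for "ITEM",
        # then move 1 row down and 0 columns across
        head, text = rule.split(": ", 1)
        _, col, offsets = head.split()
        r_off = int(offsets[2:offsets.index("]")])
        c_off = int(offsets[offsets.index("C[") + 2:-1])
        c = col_to_idx(col)
        for r, row in enumerate(sheet):
            if len(row) > c and row[c] == text:
                return r + r_off, c + c_off
        raise ValueError(f"'{text}' not found in column {col}")
    raise ValueError(f"Unsupported rule: {rule}")

sheet = [
    ["", "ITEM", "AMOUNT"],
    ["", "Income Source 1", "2500"],
    ["", "Income Source 2", "1000"],
]
print(resolve_start("MATCH B R[1]C[0]: ITEM", sheet))  # first data cell under ITEM
```

A `LASTDOWN` finish would then extend the extracted range from that start cell down to the last non-empty cell in the same column.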
To import the Excel file, the end user simply navigates to the LOAD tab, chooses "Files", selects the appropriate map (eg MAP01), and uploads. This will stage the new records in [MPE_XLMAP_DATA](/tables/mpe_xlmap_data), which will go through the usual approval process and quality checks. A copy of the source Excel file will be attached to each upload.
The corresponding [MPE_XLMAP_DATA](/tables/mpe_xlmap_data) table will appear as follows:
| LOAD_REF | XLMAP_ID | XLMAP_RANGE_ID | ROW_NO | COL_NO | VALUE_TXT |
|---------------|----------|----------------|--------|--------|-----------------|
| DC20231212T154611798_648613_3895 | MAP01 | MI_ITEM | 1 | 1 | Income Source 1 |
| DC20231212T154611798_648613_3895 | MAP01 | MI_ITEM | 2 | 1 | Income Source 2 |
| DC20231212T154611798_648613_3895 | MAP01 | MI_ITEM | 3 | 1 | Other |
| DC20231212T154611798_648613_3895 | MAP01 | MI_AMT | 1 | 1 | 2500 |
| DC20231212T154611798_648613_3895 | MAP01 | MI_AMT | 2 | 1 | 1000 |
| DC20231212T154611798_648613_3895 | MAP01 | MI_AMT | 3 | 1 | 250 |
| DC20231212T154611798_648613_3895 | MAP01 | TMI | 1 | 1 | 3750 |
| DC20231212T154611798_648613_3895 | MAP01 | CB | 1 | 1 | 864 |
| DC20231212T154611798_648613_3895 | MAP01 | RENT | 1 | 1 | 800 |
| DC20231212T154611798_648613_3895 | MAP01 | CELL | 1 | 1 | 45 |
### Video
<iframe title="Complex Excel Uploads" width="560" height="315" src="https://vid.4gl.io/videos/embed/3338f448-e92d-4822-b3ec-7f6d7530dfc8?peertubeLink=0" frameborder="0" allowfullscreen="" sandbox="allow-same-origin allow-scripts allow-popups"></iframe>

docs/files.md
# Data Controller for SAS: File Uploads
Data Controller supports the ingestion of two file formats - Excel (any version) and CSV.
If you would like to support other file types, do [get in touch](https://datacontroller.io/contact)!
## Excel Uploads
Data can be uploaded in regular (tabular) or dynamic (complex) format. For details, see the [excel](/excel) page.
## CSV Uploads
The following should be considered when uploading data in this way:
- A header row (with variable names) is required
- Variable names must match those in the target table (not case sensitive). An easy way to ensure this is to download the data from Viewer and use this as a template.
- Duplicate variable names are not permitted
- Missing columns are not permitted
- Additional columns are ignored
- The order of variables does not matter EXCEPT for the (optional) `_____DELETE__THIS__RECORD_____` variable. When used, this variable must be the **first** column.
- The delimiter is extracted from the header row - so for `var1;var2;var3` the delimiter would be assumed to be a semicolon
- The above assumes the delimiter is the first special character! So `var,1;var2;var3` would fail
- The following characters should **not** be used as delimiters
- doublequote
- quote
- space
- underscore
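The delimiter-inference rule described above (take the first special character in the header row) can be sketched as follows. This is an illustration of the documented behaviour, not the actual backend code:

```python
# Characters the docs say may NOT serve as delimiters
FORBIDDEN = {'"', "'", " ", "_"}

def infer_delimiter(header: str) -> str:
    """Return the first non-alphanumeric, non-forbidden character
    in the CSV header row - assumed to be the delimiter."""
    for ch in header:
        if not (ch.isalnum() or ch in FORBIDDEN):
            return ch
    raise ValueError("No delimiter found in header row")

print(infer_delimiter("var1;var2;var3"))   # semicolon
print(infer_delimiter("var,1;var2;var3"))  # comma - illustrates the caveat above
```

The second call shows why `var,1;var2;var3` fails: the stray comma is hit before the intended semicolon.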
When loading dates, be aware that Data Controller makes use of the `ANYDTDTE` and `ANYDTDTM` informats (width 19).
This means that uploaded date / datetime values should be unambiguous (eg `01FEB1942` rather than `01/02/42`) - the latter could be interpreted as `02JAN2042`, depending on your locale and `YEARCUTOFF` option settings. Note that UTC dates with offset values (eg `2018-12-26T09:19:25.123+0100`) are not currently supported. If this is a feature you would like to see, contact us.
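The ambiguity can be demonstrated outside SAS too. In this minimal Python sketch, the `%d/%m/%y` and `%m/%d/%y` patterns stand in for different locale conventions, and Python's two-digit-year pivot plays the role of `YEARCUTOFF`:

```python
from datetime import datetime

raw = "01/02/42"
# The same string parses to different dates depending on convention:
uk = datetime.strptime(raw, "%d/%m/%y")   # day-first locale -> 1 Feb 2042
us = datetime.strptime(raw, "%m/%d/%y")   # month-first locale -> 2 Jan 2042
print(uk.date(), us.date())

# An unambiguous form like 01FEB1942 has only one reading:
safe = datetime.strptime("01FEB1942", "%d%b%Y")
print(safe.date())
```

Note the century flip as well: both two-digit readings land in 2042, whereas the explicit four-digit form pins the year to 1942.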
!!! tip
To get a copy of a file in the right format for upload, use the [file download](/dc-userguide/#usage) feature in the Viewer tab
!!! warning
Lengths are taken from the target table. If a CSV contains long strings (eg `"ABCDE"` for a `$3` variable) then the excess characters are silently truncated (only `"ABC"` is staged and loaded). If the target variable is a short numeric (eg 4 bytes) then floats or large integers may be rounded. This issue does not apply to Excel uploads, which are first validated in the browser.
When loading CSVs, the entire file is passed to the backend for ingestion. This makes it more efficient for large files, but does mean that frontend validations are bypassed.
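The truncation warning above can be made concrete with a one-line sketch (illustration only - in SAS the truncation happens implicitly via the target variable's length during ingestion):

```python
def stage_value(value: str, target_length: int) -> str:
    """Mimic staging a CSV string into a character variable of fixed
    length: anything beyond the target length is silently dropped."""
    return value[:target_length]

print(stage_value("ABCDE", 3))  # ABC - the trailing 'DE' is lost without warning
```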

docs/filter.md
---
layout: article
title: Filter
description: Data Controller for SAS&reg; enables complex filters to be created on any variable. The "dynamic" where clause setting enables new values to be filtered by the remaining filter clauses. Filtered views are shareable!
og_image: https://docs.datacontroller.io/img/filter_dynamic_on.png
---
# Filtering
Data Controller for SAS&reg; enables you to create complex table filters. The "dynamic" setting enables the dropdown values to be pre-filtered by previous filter clauses. Filtered views are shareable!
## Shared Filters
When filters are submitted, the query is stored, and a unique URL is generated. This means you can share the link to a filtered view of a table! This can be used for VIEW, for EDIT and also for downloading data.
![](img/filter_url.png)
## Dynamic Where Clause
When filtering *without* a dynamic where clause, all values are always returned in the selection box.
![](img/filter_dynamic_off.png)
By contrast, when the dynamic where clause box is checked (default), the values in the *second and subsequent* filter clauses are filtered by the previous filter clause settings, eg:
![](img/filter_dynamic_on.png)

docs/formats.md
---
layout: article
title: Formats
description: Viewing and Modifying SAS Format Catalogs in Data Controller
---
# Formats
Data Controller allows formats to be viewed and edited directly from the web interface - avoiding the need to create and maintain parallel 'CNTLIN' datasets.
Formats are displayed with a special icon (`bolt`), in the same library as other tables (in both the VIEW and EDIT screens):
![formats](img/formats.png)
Viewing or editing a format catalog will always mean that the entire catalog is exported, before being filtered (if filters are applied) and displayed. For this reason, if performance is a consideration, it is recommended to split a large format catalog into several smaller catalogs.
The usual export mechanisms can also be applied - you can download the DDL, or export the catalog in CSV / Excel / Datalines / Markdown / DDL formats.
When adding a format to MPE_TABLES, the `DSN` should contain the format catalog name plus a `-FC` extension. The LOADTYPE should be `FORMAT_CAT` and the BUSKEY should be `FMTNAME START`. HOOK scripts can also be applied (eg, run some DQ checks after an edit, or re-run a batch job after an approval).
Example:
|LIBREF:$8.|DSN:$32.|LOADTYPE:$12.|BUSKEY:$1000.|
|---|---|---|---|
|`MYLIB `|`FORMATS-FC `|`FORMAT_CAT `|`FMTNAME START `|
Just like regular table edits, all changes to formats are logged in the `MPE_AUDIT` table.

(Binary image files added under docs/img/ - not shown in this diff.)

One SVG file was also added:
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 22.1.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 2000 520.7" style="enable-background:new 0 0 2000 520.7;" xml:space="preserve">
<style type="text/css">
.st0{fill:#E0E0E0;}
.st1{fill:#8EC63F;}
.st2{font-family:'APompadourBoldSample';}
.st3{font-size:205.7454px;}
.st4{letter-spacing:5;}
</style>
<path class="st0" d="M321.9,198.2c3.3,0.4,6.5,0.7,9.8,1.1c7.9,1.1,15.6,2.6,22.8,6.3c2.1,1.1,4.1,2.3,5.6,4.2
c2.4,2.8,2.3,5.8-0.1,8.6c-2.5,2.8-5.7,4.4-9.1,5.7c-7.9,3.1-16.2,4.5-24.6,5.2c-13.7,1.1-27.2,0.6-40.6-2.6
c-4.4-1-8.7-2.4-12.6-4.6c-1.5-0.8-2.9-1.8-4.2-3c-3.6-3.5-3.6-7.2,0.1-10.5c4-3.6,8.9-5.3,13.9-6.7c7-2,14.1-3,21.3-3.5
c0.6,0,1.3,0.2,1.8-0.3C311.2,198.2,316.6,198.2,321.9,198.2z"/>
<path class="st1" d="M266.2,280.3c6.4,8.2,15.5,11.2,24.9,13.2c18.6,4,37.2,3.5,55.3-2.9c5.6-2,10.8-4.7,14.5-9.7
c0.6-0.8,0.9-0.4,1.2,0.2c0.6,1.3,0.9,2.6,0.9,4c0,4.9,0,9.9,0,14.8c0,3.2-1.5,5.7-3.7,7.9c-4.6,4.6-10.4,7.1-16.5,9
c-12.3,3.8-24.9,4.6-37.6,3.7c-9.5-0.7-18.7-2.5-27.4-6.4c-3.7-1.7-7.1-3.7-9.8-6.8c-2.1-2.4-3.2-5.1-3.2-8.3c0.1-4.5,0-9,0-13.5
C264.8,283.7,265.1,282,266.2,280.3z"/>
<path class="st0" d="M266.3,249.4c5.3,7.1,12.7,10.1,20.6,12.2c18.9,5,37.8,4.9,56.5-0.7c6.2-1.8,12-4.5,16.5-9.4
c0.6-0.6,1-2,1.7-1.8c1,0.3,1,1.7,1.3,2.8c0.4,1.7,0.2,3.4,0.2,5.1c0,3.4-0.1,6.7,0,10.1c0.2,4.2-1.5,7.5-4.6,10.2
c-4.7,4.2-10.4,6.7-16.5,8.3c-20.5,5.5-40.9,5.4-61-1.6c-4.9-1.7-9.4-4.1-13-8c-2.2-2.4-3.3-5.2-3.3-8.5c0.1-4.4,0-8.9,0-13.3
C264.8,253,265.1,251.2,266.3,249.4z"/>
<path class="st1" d="M266.1,219.7c1.5,4.4,5.1,6.4,8.9,8.2c6.8,3.2,14.1,4.7,21.4,5.6c14.5,1.8,29,1.6,43.3-1.4
c5.8-1.2,11.4-2.8,16.5-6c2.9-1.8,4.4-3.5,5.4-6.3c1,1.4,1.3,2.9,1.3,4.4c0,5,0.1,9.9,0,14.9c-0.1,4.2-2.5,7.2-5.6,9.7
c-5.8,4.6-12.6,7-19.7,8.6c-18.8,4.1-37.4,3.8-55.7-2.4c-4.8-1.6-9.4-3.9-13.1-7.5c-2.8-2.7-4.3-5.8-4.2-9.8c0.1-4.3,0-8.6,0-12.9
C264.8,222.9,265.1,221.3,266.1,219.7z"/>
<path class="st0" d="M380.9,353.9c16.3,0.9,25-19.1,13.2-30.4c-12.4-11.9-32.8-0.8-29.6,16c-73.7,47-168.5-23.3-141.5-108.7
l-18.2-12.2C163.9,327.3,289.8,419.2,380.9,353.9z M375.7,330.4c7.6-7.9,19.7,3.1,12.6,11.4C380.8,350.5,367.6,338.8,375.7,330.4z"
/>
<path class="st1" d="M232.2,193.7c11.7,11.2,31,2,29.9-14c73.6-47.4,168.9,22.8,141.9,108.5l18.2,12.2
c39.7-105.6-80.8-200.7-174-136.8C230.7,160,219.4,181.4,232.2,193.7z M250.6,186.8c-7.8,8.1-20.1-3.7-12.3-11.8
C246.1,166.9,258.4,178.7,250.6,186.8z"/>
<text transform="matrix(1 0 0 1 500.2032 330.8743)"><tspan x="0" y="0" class="st0 st2 st3 st4">Data</tspan><tspan x="463.1" y="0" class="st1 st2 st3 st4">Controller</tspan></text>
</svg>

(Further binary image files added under docs/img/ - not shown. Some files were not shown because too many files have changed in this diff.)