114 Commits

Author SHA1 Message Date
625af199f4 chore(release): 6.4.0 [skip ci]
# [6.4.0](https://git.datacontroller.io/dc/dc/compare/v6.3.1...v6.4.0) (2024-01-24)

### Bug Fixes

* add dcLib to globals ([5d93346](5d93346b52))
* add service to get xlmap rules and fixed interface name ([9ffa30a](9ffa30ab74))
* increasing length of mpe_excel_map cols to ([2d4d068](2d4d068413))
* providing info on mapids to FE ([fd94945](fd94945466))
* removing tables from EDIT menu that are in xlmaps ([9550ae4](9550ae4d11))
* removing XLMAP_TARGETLIBDS from mpe_xlmaps_rules table ([93702c6](93702c63dc))
* renaming TABLE macvar to LOAD_REF in postdata.sas ([01915a2](01915a2db9))
* reverting xlmap in getdata change ([2d6e747](2d6e747db9))
* update edit tab to load ([516e5a2](516e5a2062))

### Features

* adding ability to define the target table for excel maps ([c86fba9](c86fba9dc7))
* adding ismap attribute to getdata response (and fixing test) ([2702bb3](2702bb3c84))
* Complex Excel Uploads ([cf19381](cf19381060)), closes [#69](#69)
* Create Tables / Files dropdown under load tab ([b473b19](b473b198a6))
* display list of maps in sidebar ([5aec024](5aec024242))
* implemented the logic for xlmap component ([50696bb](50696bb926))
* model changes for [#69](#69) ([271543a](271543a446))
* new getxlmaps service to return rules for a particular xlmap_id ([56264ec](56264ecc69))
* validating the excel map after stage (adding load-ref) ([a485c3b](a485c3b787))
2024-01-24 17:50:00 +00:00
56e9217f4b Merge pull request 'ci: semantic release requires node 20 or above' (#74) from ci-deploy into main
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 4m48s
Release / Build-and-test-development (push) Successful in 8m32s
Release / release (push) Successful in 7m1s
Reviewed-on: #74
2024-01-24 17:34:51 +00:00
86f1af7926 Merge branch 'main' into ci-deploy
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 17s
2024-01-24 17:34:21 +00:00
7737f8455d ci: semantic release requires node 20 or above
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 17s
2024-01-24 18:33:50 +01:00
b0f1677fcc Merge pull request 'Fixed mocked startupservice used for cypress testing' (#73) from ci-deploy into main
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 4m47s
Release / Build-and-test-development (push) Successful in 8m35s
Release / release (push) Failing after 1m46s
Reviewed-on: #73
2024-01-24 16:40:02 +00:00
4406e0d4b4 ci: fixed mocked startupservice used for cypress testing
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 16s
2024-01-24 17:34:07 +01:00
cf19381060 feat: Complex Excel Uploads
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 4m49s
Release / release (push) Has been skipped
Release / Build-and-test-development (push) Failing after 9m24s
Reviewed-on: #71

Closes #69
2024-01-24 13:48:07 +00:00
2a852496e9 chore: add specs
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 17s
2024-01-24 17:05:18 +05:00
4653097225 chore: move utils to separate file
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 23s
2024-01-24 15:30:22 +05:00
8afee29e02 chore: limit submitting rows based on licence
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 17s
2024-01-24 14:28:56 +05:00
233eca39ef chore: move utility functions to separate file
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 16s
2024-01-24 14:10:11 +05:00
93702c63dc fix: removing XLMAP_TARGETLIBDS from mpe_xlmaps_rules table
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 17s
2024-01-23 16:54:44 +00:00
df065562d1 chore: bumping core
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 1m9s
2024-01-23 12:10:18 +00:00
802c99adf9 chore: fix .npmrc
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-22 10:30:19 +00:00
482c7455f5 chore: fix the logic for goback button in stage component
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-19 21:03:34 +05:00
731b96dccc chore: modified xlmaps array in global variables
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 14s
2024-01-19 18:48:38 +05:00
9550ae4d11 fix: removing tables from EDIT menu that are in xlmaps
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-19 11:12:31 +00:00
2d6e747db9 fix: reverting xlmap in getdata change
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 12s
2024-01-19 10:52:39 +00:00
fd94945466 fix: providing info on mapids to FE
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-18 17:39:10 +00:00
d3b0c09332 chore: in editors/loadfile service pass attached excel file too as payload
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-18 22:19:04 +05:00
01915a2db9 fix: renaming TABLE macvar to LOAD_REF in postdata.sas
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
also adding a sample post approve hook for xlmap dataloads
2024-01-18 16:31:11 +00:00
51b043b6d2 chore: postedit hook example updates
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-18 15:43:48 +00:00
c144fd8087 chore: fixed hanging state after getting error in upload and submit
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-18 18:39:23 +05:00
12b15df78c chore: move to data tab after extracting data
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-17 22:38:26 +05:00
d6ecd12cea chore: added tab view in xlmap component
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 12s
2024-01-17 22:33:42 +05:00
1c3d498da6 chore: wording of rules page
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 12s
2024-01-17 11:23:07 +00:00
d75e10aef5 chore: quick fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-16 15:25:52 +05:00
f0f9d85558 chore: quick fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-16 14:27:44 +05:00
86f3411896 Merge pull request 'feat: complex excel upload (UI)' (#72) from issue-69-ui into issue69
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
Reviewed-on: #72
2024-01-16 09:16:43 +00:00
6daef39268 chore: add modal for displaying submit limit notice
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-16 13:56:47 +05:00
7d1720a360 Merge branch 'issue69' into issue-69-ui
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-16 08:47:47 +00:00
b11a4884b4 chore: quick fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 14s
2024-01-16 13:00:15 +05:00
50696bb926 feat: implemented the logic for xlmap component
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-16 12:21:45 +05:00
d67d4e2f86 chore: added xlmap.module.ts file 2024-01-16 12:19:52 +05:00
2f01c4d251 chore: added xlmap routing component 2024-01-16 12:17:42 +05:00
9ffa30ab74 fix: add service to get xlmap rules and fixed interface name 2024-01-16 12:14:28 +05:00
5d93346b52 fix: add dcLib to globals 2024-01-16 12:08:42 +05:00
39762b36c6 chore: updated workspace settings 2024-01-16 12:04:39 +05:00
e40ebdff05 Merge branch 'main' into issue69
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-12 13:09:39 +00:00
8d12d9e51e chore: fixing validations
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-11 18:48:13 +00:00
23708c9aae chore: fix xlmap validation logic
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 12s
2024-01-11 18:28:51 +00:00
c86fba9dc7 feat: adding ability to define the target table for excel maps
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-11 18:11:22 +00:00
e747e6e4e7 chore: additional xlmaps to cover LASTDOWN and BLANKROW scenarios
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 14s
2024-01-11 14:44:13 +00:00
5aec024242 feat: display list of maps in sidebar
implemented routing module/component for home-routing
2024-01-04 11:33:38 +05:00
b473b198a6 feat: Create Tables / Files dropdown under load tab 2024-01-04 11:28:16 +05:00
516e5a2062 fix: update edit tab to load 2024-01-04 11:26:13 +05:00
fb3abbe491 chore: update workspace settings 2024-01-04 11:24:39 +05:00
3e009f3037 chore: adding migration for new tables
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-03 12:12:38 +00:00
e63d304953 chore(release): 6.3.1 [skip ci]
## [6.3.1](https://git.datacontroller.io/dc/dc/compare/v6.3.0...v6.3.1) (2024-01-01)

### Bug Fixes

* enabling excel uploads to tables with retained keys, also adding more validation to MPE_TABLES updates ([3efccc4](3efccc4cf3))
2024-01-01 17:53:14 +00:00
3cd90c2d47 Merge pull request 'fix: enabling excel uploads to tables with retained keys, also adding more validation to MPE_TABLES updates' (#67) from dcfixes into main
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 3m11s
Release / Build-and-test-development (push) Successful in 6m31s
Release / release (push) Successful in 5m16s
Reviewed-on: #67
2024-01-01 17:42:07 +00:00
a485c3b787 feat: validating the excel map after stage (adding load-ref)
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-01 16:53:50 +00:00
2702bb3c84 feat: adding ismap attribute to getdata response (and fixing test)
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-01 16:07:47 +00:00
56264ecc69 feat: new getxlmaps service to return rules for a particular xlmap_id
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-01 14:50:02 +00:00
cc4535245c chore: adding xlmaps in startupservice response, #69
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-01 14:10:49 +00:00
6ae31de1dd chore: adding sample data for basel KM1 template
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2024-01-01 13:54:08 +00:00
2d4d068413 fix: increasing length of mpe_excel_map cols to
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 14s
2024-01-01 12:23:07 +00:00
271543a446 feat: model changes for #69
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 47s
2023-12-27 16:57:48 +01:00
ac59b77ad5 Merge branch 'main' into dcfixes
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 46s
2023-12-12 08:30:25 +00:00
3efccc4cf3 fix: enabling excel uploads to tables with retained keys, also adding more validation to MPE_TABLES updates
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 13s
2023-12-12 08:27:45 +00:00
8cbcd18f4b chore(release): 6.3.0 [skip ci]
# [6.3.0](https://git.datacontroller.io/dc/dc/compare/v6.2.8...v6.3.0) (2023-12-04)

### Features

* viewer row handle ([dadac4f](dadac4f13f))
2023-12-04 18:49:31 +00:00
6bb2378790 Merge pull request 'ci: doxygen fix' (#66) from ci-fix into main
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 3m8s
Release / Build-and-test-development (push) Successful in 6m27s
Release / release (push) Successful in 5m23s
Reviewed-on: #66
2023-12-04 18:38:27 +00:00
e7d0ffe8c0 Merge branch 'main' into ci-fix
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 48s
2023-12-04 17:38:00 +00:00
ab89600c73 style: lint
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 47s
2023-12-04 18:37:05 +01:00
830e3816a0 ci: build, syntax fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 12s
2023-12-04 18:32:17 +01:00
dadac4f13f feat: viewer row handle
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 0s
2023-12-04 18:17:49 +01:00
1de48a49af ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 0s
2023-12-04 17:06:18 +01:00
687a1e1cb5 ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 33s
2023-12-04 17:02:08 +01:00
665a04f5c5 ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 14s
2023-12-04 17:01:26 +01:00
fdb18d242b ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 19s
2023-12-04 14:47:30 +01:00
ec173da4ce ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 9s
2023-12-04 14:47:02 +01:00
bb35cc15d2 ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 8s
2023-12-04 14:46:32 +01:00
181f52eaea ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 8s
2023-12-04 14:45:57 +01:00
fc7c8101ed ci: doxygen fix
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 8s
2023-12-04 14:44:49 +01:00
a347603fe0 chore(release): 6.2.8 [skip ci]
## [6.2.8](https://git.datacontroller.io/dc/dc/compare/v6.2.7...v6.2.8) (2023-12-04)

### Bug Fixes

* bumping sasjs/core to fix mp_loadformat issue ([a1d308e](a1d308ea07))
* new logic for -fc suffix.  Closes [#63](#63) ([5579db0](5579db0eaf))
2023-12-04 11:51:52 +00:00
09022c995f chore: prettier fix
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 3m5s
Release / Build-and-test-development (push) Successful in 6m15s
Release / release (push) Failing after 4m56s
2023-12-04 12:41:04 +01:00
3609943f30 Merge pull request 'fix: new logic for -fc suffix, plus fixes for format record additions / deletions' (#64) from dcfixes into main
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 3m5s
Release / Build-and-test-development (push) Failing after 7m12s
Release / release (push) Has been skipped
Reviewed-on: #64
2023-12-03 13:56:26 +00:00
a1d308ea07 fix: bumping sasjs/core to fix mp_loadformat issue
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 12s
2023-12-03 13:53:53 +00:00
5579db0eaf fix: new logic for -fc suffix. Closes #63
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 52s
2023-12-03 11:19:40 +00:00
3a3e488b23 chore: updating yaml to use self-hosted doc site
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 7m20s
Release / Build-and-test-development (push) Successful in 13m43s
Release / release (push) Failing after 2m36s
2023-11-14 22:23:05 +00:00
0a82ec0a70 chore: updating link in README to releases page
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 7m50s
Release / Build-and-test-development (push) Successful in 14m11s
Release / release (push) Failing after 2m42s
2023-11-14 21:43:55 +00:00
bc1d89218e chore(release): 6.2.7 [skip ci]
## [6.2.7](https://git.datacontroller.io/dc/dc/compare/v6.2.6...v6.2.7) (2023-11-09)

### Bug Fixes

* **audit:** updated crypto-js (hashing rows in dynamic cell validation) ([a7aa42a](a7aa42a59b))
* missing dependency and avoiding label length limit issue ([91f128c](91f128c2fe))
2023-11-09 09:01:38 +00:00
817b9adeac Merge pull request 'fix(audit): updated crypto-js (hashing rows in dynamic cell validation)' (#62) from audit-fix into main
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 8m0s
Release / Build-and-test-development (push) Successful in 13m41s
Release / release (push) Successful in 12m7s
Reviewed-on: #62
2023-11-09 08:37:19 +00:00
a7aa42a59b fix(audit): updated crypto-js (hashing rows in dynamic cell validation)
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m45s
2023-11-09 09:20:21 +01:00
34f239036d Merge pull request 'fix: missing dependency and avoiding label length limit issue' (#61) from corebump into main
Some checks failed
Release / Build-production-and-ng-test (push) Failing after 2m51s
Release / Build-and-test-development (push) Has been skipped
Release / release (push) Has been skipped
Reviewed-on: #61
2023-11-08 21:45:25 +00:00
91f128c2fe fix: missing dependency and avoiding label length limit issue
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m43s
Relates to the following core issues:
* https://github.com/sasjs/core/issues/364
* https://github.com/sasjs/core/issues/363
2023-11-08 21:36:17 +00:00
a00ebea692 chore(release): 6.2.6 [skip ci]
## [6.2.6](https://git.datacontroller.io/dc/dc/compare/v6.2.5...v6.2.6) (2023-10-18)

### Bug Fixes

* bumping core to address mm_assigndirectlib issue ([c27cdab](c27cdab3fc))
2023-10-18 10:48:48 +00:00
c27cdab3fc fix: bumping core to address mm_assigndirectlib issue
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 7m19s
Release / Build-and-test-development (push) Successful in 11m45s
Release / release (push) Successful in 11m9s
2023-10-18 11:26:29 +01:00
d7da2d7890 chore(release): 6.2.5 [skip ci]
## [6.2.5](https://git.datacontroller.io/dc/dc/compare/v6.2.4...v6.2.5) (2023-10-17)

### Bug Fixes

* enabling AUTHDOMAIN in MM_ASSIGNDIRECTLIB ([008b45a](008b45ad17))
2023-10-17 16:06:19 +00:00
76f0fd4232 Merge pull request 'fix: enabling AUTHDOMAIN in MM_ASSIGNDIRECTLIB for ODBC Engines' (#60) from issue59 into main
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 7m8s
Release / Build-and-test-development (push) Successful in 11m34s
Release / release (push) Successful in 10m54s
Reviewed-on: #60
2023-10-17 15:44:34 +00:00
008b45ad17 fix: enabling AUTHDOMAIN in MM_ASSIGNDIRECTLIB
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m19s
Bumped sasjs/core to 4.48.1
2023-10-17 16:42:05 +01:00
4d49263816 chore(release): 6.2.4 [skip ci]
## [6.2.4](https://git.datacontroller.io/dc/dc/compare/v6.2.3...v6.2.4) (2023-10-16)

### Bug Fixes

* Enable display of metadata-only tables. Closes [#56](#56) ([f3e82b4](f3e82b4ee2))
2023-10-16 16:38:06 +00:00
6a2482e5c6 Merge pull request 'fix: Enable display of metadata-only tables. Closes #56' (#57) from issue56 into main
All checks were successful
Release / Build-production-and-ng-test (push) Successful in 7m19s
Release / Build-and-test-development (push) Successful in 11m26s
Release / release (push) Successful in 11m9s
Reviewed-on: #57
2023-10-16 15:10:04 +00:00
b5c3fb2af4 Merge branch 'main' into issue56
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m18s
2023-10-16 14:55:11 +00:00
fa2c8eb839 Merge pull request 'Adding cypress videos to the artifacts' (#58) from ci into main
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 7m5s
Release / Build-and-test-development (push) Successful in 11m21s
Release / release (push) Failing after 2m6s
Reviewed-on: #58
2023-10-16 14:24:10 +00:00
f3e82b4ee2 fix: Enable display of metadata-only tables. Closes #56
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m21s
2023-10-16 15:21:01 +01:00
ba67248155 chore: adding cypress videos to the artifacts
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m21s
2023-10-16 16:11:56 +02:00
a6d962bfaa Merge pull request 'Downgrading SheetJS' (#55) from sheetjs-downgrade into main
Some checks reported warnings
Release / Build-production-and-ng-test (push) Has been cancelled
Release / Build-and-test-development (push) Has been cancelled
Release / release (push) Has been cancelled
Reviewed-on: #55
2023-10-16 10:48:02 +00:00
95cddb52d4 chore: package-lock
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m22s
2023-10-16 12:40:17 +02:00
5a5118d775 chore: downgrading sheetjs
Some checks failed
Build / Build-and-ng-test (pull_request) Failing after 27s
2023-10-16 12:34:30 +02:00
1b1cdd7a4b Merge pull request 'Removing development branch jobs, moving test jobs to the release pipeline' (#54) from discard-development into main
Some checks failed
Release / Build-production-and-ng-test (push) Successful in 7m9s
Release / Build-and-test-development (push) Failing after 14m17s
Release / release (push) Has been skipped
Reviewed-on: #54
2023-10-16 09:46:34 +00:00
a9ddf7f7dd chore: removing development branch JOBS, moving test jobs to the release pipeline
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m23s
2023-10-13 11:24:42 +02:00
b54b3f1778 chore(release): 6.2.3 [skip ci]
## [6.2.3](https://git.datacontroller.io/dc/dc/compare/v6.2.2...v6.2.3) (2023-10-12)

### Bug Fixes

* bumping core library to avoid non-ascii char in mp_validatecols.sas. [#50](#50) ([11b06f6](11b06f6416))
* removing copyright symbol from mpe_alerts macro. [#50](#50) ([adb7eb7](adb7eb7755))
2023-10-12 09:42:44 +00:00
349a63c591 Merge pull request 'fix: removing non-ascii chars from SAS program headers' (#51) from issue50 into main
All checks were successful
Release / release (push) Successful in 11m21s
Reviewed-on: #51
2023-10-12 09:40:30 +00:00
293d33912f Merge pull request 'development' (#52) from development into issue50
Some checks reported warnings
Build / Build-and-ng-test (pull_request) Has been cancelled
Reviewed-on: #52
2023-10-12 09:38:43 +00:00
357b9849e7 Merge branch 'issue50' into development
Some checks reported warnings
Build / Build-and-ng-test (pull_request) Has been cancelled
Test / Build-and-test-development (push) Has been cancelled
Test / Build-and-test-development-latest-adapter (push) Has been cancelled
Test / Build-production-and-ng-test (push) Has been cancelled
2023-10-12 09:38:19 +00:00
0c8a9eef32 chore(ci): build and ng test fix
Some checks failed
Test / Build-production-and-ng-test (push) Successful in 7m30s
Test / Build-and-test-development (push) Failing after 14m29s
Test / Build-and-test-development-latest-adapter (push) Failing after 14m45s
Build / Build-and-ng-test (pull_request) Successful in 1m25s
2023-10-12 08:33:31 +02:00
112b1d0da4 chore(ci): audit check fix
Some checks failed
Test / Build-production-and-ng-test (push) Failing after 2m22s
Test / Build-and-test-development-latest-adapter (push) Has been cancelled
Test / Build-and-test-development (push) Has been cancelled
2023-10-12 08:26:35 +02:00
a05007416a chore(ci): audit check fix
Some checks failed
Test / Build-production-and-ng-test (push) Failing after 2m25s
Test / Build-and-test-development-latest-adapter (push) Has been cancelled
Test / Build-and-test-development (push) Has been cancelled
2023-10-12 08:21:58 +02:00
9f7dd55583 Merge pull request 'Release fix' (#49) from release-fix into development
Some checks failed
Test / Build-production-and-ng-test (push) Failing after 2m36s
Test / Build-and-test-development (push) Failing after 14m29s
Test / Build-and-test-development-latest-adapter (push) Failing after 14m30s
Reviewed-on: #49
2023-10-11 22:02:20 +00:00
11b06f6416 fix: bumping core library to avoid non-ascii char in mp_validatecols.sas. #50
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m30s
2023-10-11 22:57:07 +01:00
adb7eb7755 fix: removing copyright symbol from mpe_alerts macro. #50 2023-10-11 22:56:14 +01:00
b776b80728 chore: making semantic-release fail if no release available
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m17s
2023-10-11 16:34:06 +02:00
73a149ea7b chore: updated contributing.md with release instructions
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m18s
2023-10-09 13:02:13 +02:00
ef8784093b chore: release is draft fix, release will update package.json version
All checks were successful
Build / Build-and-ng-test (pull_request) Successful in 1m18s
2023-10-09 12:46:17 +02:00
79 changed files with 11858 additions and 1313 deletions

View File

@@ -1,5 +1,5 @@
 name: Build
-run-name: Running Lint Check
+run-name: Running Lint Check and Licence checker on Pull Request
 on: [pull_request]
 jobs:
@@ -18,8 +18,11 @@ jobs:
     env:
       NPMRC: ${{ secrets.NPMRC}}
-  - run: npm run lint:check
-  - run: |
+  - name: Lint check
+    run: npm run lint:check
+  - name: Licence checker
+    run: |
       cd client
       npm ci
       npm run license-checker
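For context, the two hunks above rename the workflow run and split the single anonymous `run` step into two named steps. The resulting steps section of the Build workflow would read roughly as follows (indentation is assumed, since the diff view strips it):

```yaml
  - name: Lint check
    run: npm run lint:check
  - name: Licence checker
    run: |
      cd client
      npm ci
      npm run license-checker
```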

View File

@@ -1,224 +0,0 @@
name: Test
run-name: Building and testing development branch
on:
push:
branches:
- development
jobs:
Build-production-and-ng-test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 18
- name: Write .npmrc file
run: |
touch client/.npmrc
echo '${{ secrets.NPMRC}}' > client/.npmrc
- name: Install Chrome for Angular tests
run: |
apt-get update
wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
apt install -y ./google-chrome*.deb;
export CHROME_BIN=/usr/bin/google-chrome
- name: Write cypress credentials
run: echo "$CYPRESS_CREDS" > ./client/cypress.env.json
shell: bash
env:
CYPRESS_CREDS: ${{ secrets.CYPRESS_CREDS }}
- name: Install dependencies
run: npm ci
- name: Check audit
# Audit should fail and stop the CI if critical vulnerability found
run: |
npm audit --audit-level=critical
cd ./sas
npm audit --audit-level=critical
cd ./client
npm audit --audit-level=critical
- name: Angular Tests
run: |
npm test -- --no-watch --no-progress --browsers=ChromeHeadlessCI
- name: Angular Production Build
run: |
npm run postinstall
npm run build
Build-and-test-development:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 18
- name: Write .npmrc file
run: |
touch client/.npmrc
echo '${{ secrets.NPMRC}}' > client/.npmrc
- run: apt-get update
- run: wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
- run: apt install -y ./google-chrome*.deb;
- run: export CHROME_BIN=/usr/bin/google-chrome
- run: apt-get update -y
- run: apt-get -y install libgtk2.0-0 libgtk-3-0 libgbm-dev libnotify-dev libgconf-2-4 libnss3 libxss1 libasound2 libxtst6 xauth xvfb
- run: apt -y install jq
- name: Write cypress credentials
run: echo "$CYPRESS_CREDS" > ./client/cypress.env.json
shell: bash
env:
CYPRESS_CREDS: ${{ secrets.CYPRESS_CREDS }}
- name: Install dependencies
run: npm ci
# Install pm2 and prepare SASJS server
- run: npm i -g pm2
- run: curl -L https://github.com/sasjs/server/releases/latest/download/linux.zip > linux.zip
- run: unzip linux.zip
- run: touch .env
- run: echo RUN_TIMES=js >> .env
- run: echo NODE_PATH=node >> .env
- run: echo CORS=enable >> .env
- run: echo WHITELIST=http://localhost:4200 >> .env
- run: cat .env
- run: pm2 start api-linux --wait-ready
- name: Deploy mocked services
run: |
cd ./sas/mocks/sasjs
npm install -g @sasjs/cli
npm install -g replace-in-files-cli
sasjs cbd -t server-ci
# sasjs request services/admin/makedata -t server-ci -d ./deploy/makeData4GL.json -c ./deploy/requestConfig.json -o ./output.json
- name: Install ZIP
run: |
apt-get update
apt-get install zip
- name: Prepare and run frontend and cypress
run: |
cd ./client
mv ./cypress.env.example.json ./cypress.env.json
replace-in-files --regex='"username".*' --replacement='"username":"'${{ secrets.CYPRESS_USERNAME_SASJS }}'",' ./cypress.env.json
replace-in-files --regex='"password".*' --replacement='"password":"'${{ secrets.CYPRESS_PWD_SASJS }}'" ' ./cypress.env.json
cat ./cypress.env.json
npm run postinstall
# Prepare index.html to SASJS local
replace-in-files --regex='serverUrl=".*?"' --replacement='serverUrl="http://localhost:5000"' ./src/index.html
replace-in-files --regex='appLoc=".*?"' --replacement='appLoc="/Public/app/devtest"' ./src/index.html
replace-in-files --regex='serverType=".*?"' --replacement='serverType="SASJS"' ./src/index.html
replace-in-files --regex='"hosturl".*' --replacement='hosturl:"http://localhost:4200",' ./cypress.config.ts
cat ./cypress.config.ts
# Start frontend and run cypress
npm start & npx wait-on http://localhost:4200 && npx cypress run --browser chrome --spec "cypress/e2e/liveness.cy.ts,cypress/e2e/editor.cy.ts,cypress/e2e/excel.cy.ts,cypress/e2e/filtering.cy.ts,cypress/e2e/licensing.cy.ts"
- name: Zip Cypress videos
if: always()
run: |
zip -r cypress-videos ./client/cypress/videos
- name: Cypress videos artifacts
uses: actions/upload-artifact@v3
with:
name: cypress-videos.zip
path: cypress-videos.zip
Build-and-test-development-latest-adapter:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 18
- name: Write .npmrc file
run: echo "$NPMRC" > client/.npmrc
shell: bash
env:
NPMRC: ${{ secrets.NPMRC}}
- run: apt-get update
- run: wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
- run: apt install -y ./google-chrome*.deb;
- run: export CHROME_BIN=/usr/bin/google-chrome
- run: apt-get update -y
- run: apt-get -y install libgtk2.0-0 libgtk-3-0 libgbm-dev libnotify-dev libgconf-2-4 libnss3 libxss1 libasound2 libxtst6 xauth xvfb
- run: apt -y install jq
- name: Write cypress credentials
run: echo "$CYPRESS_CREDS" > ./client/cypress.env.json
shell: bash
env:
CYPRESS_CREDS: ${{ secrets.CYPRESS_CREDS }}
- name: Install dependencies
run: npm ci
# Install pm2 and prepare SASJS server
- run: npm i -g pm2
- run: curl -L https://github.com/sasjs/server/releases/latest/download/linux.zip > linux.zip
- run: unzip linux.zip
- run: touch .env
- run: echo RUN_TIMES=js >> .env
- run: echo NODE_PATH=node >> .env
- run: echo CORS=enable >> .env
- run: echo WHITELIST=http://localhost:4200 >> .env
- run: cat .env
- run: pm2 start api-linux --wait-ready
- name: Deploy mocked services
run: |
cd ./sas/mocks/sasjs
npm install -g @sasjs/cli
npm install -g replace-in-files-cli
sasjs cbd -t server-ci
- name: Install ZIP
run: |
apt-get update
apt-get install zip
- name: Prepare and run frontend and cypress
run: |
cd ./client
mv ./cypress.env.example.json ./cypress.env.json
replace-in-files --regex='"username".*' --replacement='"username":"'${{ secrets.CYPRESS_USERNAME_SASJS }}'",' ./cypress.env.json
replace-in-files --regex='"password".*' --replacement='"password":"'${{ secrets.CYPRESS_PWD_SASJS }}'" ' ./cypress.env.json
cat ./cypress.env.json
npm run postinstall
npm install @sasjs/adapter@latest
# Prepare index.html to SASJS local
replace-in-files --regex='serverUrl=".*?"' --replacement='serverUrl="http://localhost:5000"' ./src/index.html
replace-in-files --regex='appLoc=".*?"' --replacement='appLoc="/Public/app/devtest"' ./src/index.html
replace-in-files --regex='serverType=".*?"' --replacement='serverType="SASJS"' ./src/index.html
replace-in-files --regex='"hosturl".*' --replacement='hosturl:"http://localhost:4200",' ./cypress.config.ts
cat ./cypress.config.ts
# Start frontend and run cypress
npm start & npx wait-on http://localhost:4200 && npx cypress run --browser chrome --spec "cypress/e2e/liveness.cy.ts,cypress/e2e/editor.cy.ts,cypress/e2e/excel.cy.ts,cypress/e2e/filtering.cy.ts,cypress/e2e/licensing.cy.ts"
- name: Zip Cypress videos
if: always()
run: |
zip -r cypress-videos ./client/cypress/videos
- name: Cypress videos artifacts
uses: actions/upload-artifact@v3
with:
name: cypress-videos-latest-adapter.zip
path: cypress-videos.zip
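The deleted Test workflow above (and the Release workflow that replaces it) builds the SASJS server `.env` file one `echo` at a time. A sketch of an equivalent single step, assuming the same four settings, would be:

```shell
# Write the SASJS server settings in one step
# (same content as the touch/echo chain in the workflow above)
cat > .env <<'EOF'
RUN_TIMES=js
NODE_PATH=node
CORS=enable
WHITELIST=http://localhost:4200
EOF
cat .env
```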

View File

@@ -1,19 +1,157 @@
 name: Release
-run-name: Releasing DC
+run-name: Testing and Releasing DC
 on:
   push:
     branches:
       - main
 jobs:
-  release:
+  Build-production-and-ng-test:
     runs-on: ubuntu-latest
     steps:
       - uses: actions/checkout@v3
       - uses: actions/setup-node@v3
         with:
          node-version: 18
- name: Write .npmrc file
run: |
touch client/.npmrc
echo '${{ secrets.NPMRC}}' > client/.npmrc
- name: Install Chrome for Angular tests
run: |
apt-get update
wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
apt install -y ./google-chrome*.deb;
export CHROME_BIN=/usr/bin/google-chrome
- name: Write cypress credentials
run: echo "$CYPRESS_CREDS" > ./client/cypress.env.json
shell: bash
env:
CYPRESS_CREDS: ${{ secrets.CYPRESS_CREDS }}
- name: Install dependencies
run: npm ci
- name: Check audit
# Audit should fail and stop the CI if critical vulnerability found
run: |
npm audit --audit-level=critical --omit=dev
cd ./sas
npm audit --audit-level=critical --omit=dev
cd ../client
npm audit --audit-level=critical --omit=dev
- name: Angular Tests
run: |
cd client
npm test -- --no-watch --no-progress --browsers=ChromeHeadlessCI
- name: Angular Production Build
run: |
cd client
npm run postinstall
npm run build
Build-and-test-development:
runs-on: ubuntu-latest
needs: Build-production-and-ng-test
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 18
- name: Write .npmrc file
run: |
touch client/.npmrc
echo '${{ secrets.NPMRC}}' > client/.npmrc
- run: apt-get update
- run: wget https://dl.google.com/linux/direct/google-chrome-stable_current_amd64.deb
- run: apt install -y ./google-chrome*.deb;
- run: export CHROME_BIN=/usr/bin/google-chrome
- run: apt-get update -y
- run: apt-get -y install libgtk2.0-0 libgtk-3-0 libgbm-dev libnotify-dev libgconf-2-4 libnss3 libxss1 libasound2 libxtst6 xauth xvfb
- run: apt -y install jq
- name: Write cypress credentials
run: echo "$CYPRESS_CREDS" > ./client/cypress.env.json
shell: bash
env:
CYPRESS_CREDS: ${{ secrets.CYPRESS_CREDS }}
- name: Install dependencies
run: npm ci
# Install pm2 and prepare SASJS server
- run: npm i -g pm2
- run: curl -L https://github.com/sasjs/server/releases/latest/download/linux.zip > linux.zip
- run: unzip linux.zip
- run: touch .env
- run: echo RUN_TIMES=js >> .env
- run: echo NODE_PATH=node >> .env
- run: echo CORS=enable >> .env
- run: echo WHITELIST=http://localhost:4200 >> .env
- run: cat .env
- run: pm2 start api-linux --wait-ready
- name: Deploy mocked services
run: |
cd ./sas/mocks/sasjs
npm install -g @sasjs/cli
npm install -g replace-in-files-cli
sasjs cbd -t server-ci
# sasjs request services/admin/makedata -t server-ci -d ./deploy/makeData4GL.json -c ./deploy/requestConfig.json -o ./output.json
- name: Install ZIP
run: |
apt-get update
apt-get install zip
- name: Prepare and run frontend and cypress
run: |
cd ./client
mv ./cypress.env.example.json ./cypress.env.json
replace-in-files --regex='"username".*' --replacement='"username":"'${{ secrets.CYPRESS_USERNAME_SASJS }}'",' ./cypress.env.json
replace-in-files --regex='"password".*' --replacement='"password":"'${{ secrets.CYPRESS_PWD_SASJS }}'" ' ./cypress.env.json
cat ./cypress.env.json
npm run postinstall
# Prepare index.html to SASJS local
replace-in-files --regex='serverUrl=".*?"' --replacement='serverUrl="http://localhost:5000"' ./src/index.html
replace-in-files --regex='appLoc=".*?"' --replacement='appLoc="/Public/app/devtest"' ./src/index.html
replace-in-files --regex='serverType=".*?"' --replacement='serverType="SASJS"' ./src/index.html
replace-in-files --regex='"hosturl".*' --replacement='hosturl:"http://localhost:4200",' ./cypress.config.ts
cat ./cypress.config.ts
# Start frontend and run cypress
npm start & npx wait-on http://localhost:4200 && npx cypress run --browser chrome --spec "cypress/e2e/liveness.cy.ts,cypress/e2e/editor.cy.ts,cypress/e2e/excel.cy.ts,cypress/e2e/filtering.cy.ts,cypress/e2e/licensing.cy.ts"
- name: Zip Cypress videos
if: always()
run: |
zip -r cypress-videos ./client/cypress/videos
- name: Add cypress videos artifacts
if: always()
uses: actions/upload-artifact@v3
with:
name: cypress-videos.zip
path: cypress-videos.zip
release:
runs-on: ubuntu-latest
needs: [Build-production-and-ng-test, Build-and-test-development]
steps:
- uses: actions/checkout@v3
- uses: actions/setup-node@v3
with:
node-version: 20
- name: Write .npmrc file
run: |
echo "$NPMRC" > client/.npmrc
@@ -30,11 +168,16 @@ jobs:
npm i -g @sasjs/cli
# jq is used to parse the release JSON
apt-get install jq -y
# doxygen is used for the SASJS docs
apt-get update
apt-get install doxygen -y
- name: Create Empty Release (assets are posted later)
run: |
npm i
npm i -g semantic-release
# We do a semantic-release DRY RUN to make the job fail if there are no changes to release
GITEA_TOKEN=${{ secrets.RELEASE_TOKEN }} GITEA_URL=https://git.datacontroller.io semantic-release --dry-run | grep -q "There are no relevant changes, so no new version is released." && exit 1
GITEA_TOKEN=${{ secrets.RELEASE_TOKEN }} GITEA_URL=https://git.datacontroller.io semantic-release
- name: Frontend Build
@@ -100,12 +243,18 @@
npm run compodoc:build
surfer put --token ${{ secrets.TSDOC_TOKEN }} --server webdoc.datacontroller.io documentation/* /
- name: Release code.datacontroller.io
run: |
cd sas
sasjs doc
surfer put --token ${{ secrets.CODE_DATACONTROLLER_IO }} --server code.datacontroller.io sasjsbuild/sasdocs/* /
- name: Upload assets to release
run: |
RELEASE_ID=`curl -k 'https://git.datacontroller.io/api/v1/repos/dc/dc/releases/latest?access_token=${{ secrets.RELEASE_TOKEN }}' | jq -r '.id'`
RELEASE_BODY=`curl -k 'https://git.datacontroller.io/api/v1/repos/dc/dc/releases/latest?access_token=${{ secrets.RELEASE_TOKEN }}' | jq -r '.body'`
# Update body
curl --data '{"draft": false,"body":"'"$RELEASE_BODY\n\nFor installation instructions, please visit https://docs.datacontroller.io/"'"}' -X PATCH --header 'Content-Type: application/json' -k https://git.datacontroller.io/api/v1/repos/dc/dc/releases/$RELEASE_ID?access_token=${{ secrets.RELEASE_TOKEN }}
# Upload assets
URL="https://git.datacontroller.io/api/v1/repos/dc/dc/releases/$RELEASE_ID/assets?access_token=${{ secrets.RELEASE_TOKEN }}"
curl -k $URL -F attachment=@frontend.zip


@@ -6,11 +6,13 @@
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/changelog",
"@semantic-release/npm",
[
"@semantic-release/git",
{
"assets": [
"CHANGELOG.md",
"package.json"
]
}
],

.vscode/settings.json

@@ -1,18 +1,19 @@
{
"cSpell.words": [
"Licence",
"SYSERRORTEXT",
"SYSWARNINGTEXT",
"xlmaprules",
"xlmaps"
],
"editor.rulers": [80],
"files.trimTrailingWhitespace": true,
"[markdown]": {
"files.trimTrailingWhitespace": false
},
"workbench.colorCustomizations": {
"titleBar.activeForeground": "#ebe8e8",
"titleBar.activeBackground": "#95ff0053"
},
"terminal.integrated.wordSeparators": " ()[]{}',\"`─‘’"
}


@@ -1,3 +1,90 @@
# [6.4.0](https://git.datacontroller.io/dc/dc/compare/v6.3.1...v6.4.0) (2024-01-24)
### Bug Fixes
* add dcLib to globals ([5d93346](https://git.datacontroller.io/dc/dc/commit/5d93346b52eda27c2829770e96686a713296d373))
* add service to get xlmap rules and fixed interface name ([9ffa30a](https://git.datacontroller.io/dc/dc/commit/9ffa30ab747f5b62acbd452431a5e6e440afcb80))
* increasing length of mpe_excel_map cols to ([2d4d068](https://git.datacontroller.io/dc/dc/commit/2d4d068413dcdac98581f08939e74bde65b73428))
* providing info on mapids to FE ([fd94945](https://git.datacontroller.io/dc/dc/commit/fd94945466c1a797ddc89815258a65624a9cb0cf))
* removing tables from EDIT menu that are in xlmaps ([9550ae4](https://git.datacontroller.io/dc/dc/commit/9550ae4d1154a0272f8a2427ac9d2afdfd699c96))
* removing XLMAP_TARGETLIBDS from mpe_xlmaps_rules table ([93702c6](https://git.datacontroller.io/dc/dc/commit/93702c63dc280cdba1e46f0fd8fe0deaec879611))
* renaming TABLE macvar to LOAD_REF in postdata.sas ([01915a2](https://git.datacontroller.io/dc/dc/commit/01915a2db9a4dfb94e4e8213e2c32181da36d349))
* reverting xlmap in getdata change ([2d6e747](https://git.datacontroller.io/dc/dc/commit/2d6e747db9b84e9fb0dfcf9102a2f7dd2cb51891))
* update edit tab to load ([516e5a2](https://git.datacontroller.io/dc/dc/commit/516e5a206216f79ab1dce9f4eab0d31115743160))
### Features
* adding ability to define the target table for excel maps ([c86fba9](https://git.datacontroller.io/dc/dc/commit/c86fba9dc75ddc6033132f469ad1c31b9131b12e))
* adding ismap attribute to getdata response (and fixing test) ([2702bb3](https://git.datacontroller.io/dc/dc/commit/2702bb3c84c45903def1aa2b8cc20a6dd080281b))
* Complex Excel Uploads ([cf19381](https://git.datacontroller.io/dc/dc/commit/cf193810606f287b8d6f864c4eb64d43c5ab5f3c)), closes [#69](https://git.datacontroller.io/dc/dc/issues/69)
* Create Tables / Files dropdown under load tab ([b473b19](https://git.datacontroller.io/dc/dc/commit/b473b198a61f468dff74cd8e64692e7847084a80))
* display list of maps in sidebar ([5aec024](https://git.datacontroller.io/dc/dc/commit/5aec0242429942f8a989b5fb79f8d3865e9de01a))
* implemented the logic for xlmap component ([50696bb](https://git.datacontroller.io/dc/dc/commit/50696bb926dd00472db65a008771a4b6352871be))
* model changes for [#69](https://git.datacontroller.io/dc/dc/issues/69) ([271543a](https://git.datacontroller.io/dc/dc/commit/271543a446a2116718f99f0540e3cd911f9f5fe7))
* new getxlmaps service to return rules for a particular xlmap_id ([56264ec](https://git.datacontroller.io/dc/dc/commit/56264ecc6908bf6c8e3e666dfeba7068d6195df8))
* validating the excel map after stage (adding load-ref) ([a485c3b](https://git.datacontroller.io/dc/dc/commit/a485c3b78724a36f7bacb264fb02140cc62d6512))
## [6.3.1](https://git.datacontroller.io/dc/dc/compare/v6.3.0...v6.3.1) (2024-01-01)
### Bug Fixes
* enabling excel uploads to tables with retained keys, also adding more validation to MPE_TABLES updates ([3efccc4](https://git.datacontroller.io/dc/dc/commit/3efccc4cf3752763d049836724f2491c287f65db))
# [6.3.0](https://git.datacontroller.io/dc/dc/compare/v6.2.8...v6.3.0) (2023-12-04)
### Features
* viewer row handle ([dadac4f](https://git.datacontroller.io/dc/dc/commit/dadac4f13f85b5446198b6340cad28844defc94d))
## [6.2.8](https://git.datacontroller.io/dc/dc/compare/v6.2.7...v6.2.8) (2023-12-04)
### Bug Fixes
* bumping sasjs/core to fix mp_loadformat issue ([a1d308e](https://git.datacontroller.io/dc/dc/commit/a1d308ea078786b27bf7ec940d018fc657d4c398))
* new logic for -fc suffix. Closes [#63](https://git.datacontroller.io/dc/dc/issues/63) ([5579db0](https://git.datacontroller.io/dc/dc/commit/5579db0eafc668b1bc310099b7cc3062e0598fc4))
## [6.2.7](https://git.datacontroller.io/dc/dc/compare/v6.2.6...v6.2.7) (2023-11-09)
### Bug Fixes
* **audit:** updated crypto-js (hashing rows in dynamic cell validation) ([a7aa42a](https://git.datacontroller.io/dc/dc/commit/a7aa42a59b71597399924b8d2d06010c806321f3))
* missing dependency and avoiding label length limit issue ([91f128c](https://git.datacontroller.io/dc/dc/commit/91f128c2fead1e4f72267d689e67f49ec9a2ab35))
## [6.2.6](https://git.datacontroller.io/dc/dc/compare/v6.2.5...v6.2.6) (2023-10-18)
### Bug Fixes
* bumping core to address mm_assigndirectlib issue ([c27cdab](https://git.datacontroller.io/dc/dc/commit/c27cdab3fccbde814a29424d0344173a73ea816c))
## [6.2.5](https://git.datacontroller.io/dc/dc/compare/v6.2.4...v6.2.5) (2023-10-17)
### Bug Fixes
* enabling AUTHDOMAIN in MM_ASSIGNDIRECTLIB ([008b45a](https://git.datacontroller.io/dc/dc/commit/008b45ad175ec0e6026f5ef3bc210470226e328f))
## [6.2.4](https://git.datacontroller.io/dc/dc/compare/v6.2.3...v6.2.4) (2023-10-16)
### Bug Fixes
* Enable display of metadata-only tables. Closes [#56](https://git.datacontroller.io/dc/dc/issues/56) ([f3e82b4](https://git.datacontroller.io/dc/dc/commit/f3e82b4ee2a9c1c851f812ac60e9eaf05f91a0f9))
## [6.2.3](https://git.datacontroller.io/dc/dc/compare/v6.2.2...v6.2.3) (2023-10-12)
### Bug Fixes
* bumping core library to avoid non-ascii char in mp_validatecols.sas. [#50](https://git.datacontroller.io/dc/dc/issues/50) ([11b06f6](https://git.datacontroller.io/dc/dc/commit/11b06f6416300b6d70b1570c415d5a5c004976db))
* removing copyright symbol from mpe_alerts macro. [#50](https://git.datacontroller.io/dc/dc/issues/50) ([adb7eb7](https://git.datacontroller.io/dc/dc/commit/adb7eb77550c68a2dab15a6ff358129820e9b612))
## [6.2.2](https://git.datacontroller.io/dc/dc/compare/v6.2.1...v6.2.2) (2023-10-09)


@@ -53,6 +53,17 @@ npm run lint:fix
Typedoc is used for generating typescript documentation based on the code.
That part is automated and being done as part of the CI job.
# Release
Release is automated as part of the CI job. Workflow file: `.gitea/workflows/release.yaml`.
It runs automatically when a branch is merged into the `main` branch.
IMPORTANT!
If the release job fails after it has already created an (empty) release and a tag, do not re-run the release job until the newly created git tag and release have been removed.
To remove the git tag run:
```
git push -d origin vX.X.X
```
To remove the release, use the repo administration pages at [https://git.datacontroller.io/dc/dc](https://git.datacontroller.io/dc/dc)
# Troubleshooting
## Makedata service "could not create directory" error

client/package-lock.json (generated; diff suppressed because it is too large)

@@ -51,13 +51,13 @@
"@handsontable/angular": "^13.1.0",
"@sasjs/adapter": "4.10.1",
"@sasjs/utils": "^3.4.0",
"@sheet/crypto": "1.20211122.1",
"@types/d3-graphviz": "^2.6.7",
"@types/text-encoding": "0.0.35",
"base64-arraybuffer": "^0.2.0",
"buffer": "^5.4.3",
"crypto-browserify": "3.12.0",
"crypto-js": "^4.2.0",
"d3-graphviz": "^5.0.2",
"fs-extra": "^7.0.1",
"handsontable": "^13.1.0",
@@ -93,7 +93,7 @@
"@compodoc/compodoc": "^1.1.21",
"@cypress/webpack-preprocessor": "^5.17.1",
"@types/core-js": "^2.5.5",
"@types/crypto-js": "^4.2.1",
"@types/es6-shim": "^0.31.39",
"@types/jasmine": "~3.6.0",
"@types/lodash-es": "^4.17.3",


@@ -37,6 +37,12 @@ export const initFilter: { filter: FilterCache } = {
}
}
export interface XLMapListItem {
id: string
description: string
targetDS: string
}
/**
* Cached filtering values across whole app (editor, viewer, viewboxes)
* Cached lineage libraries, tables
@@ -46,6 +52,8 @@ export const initFilter: { filter: FilterCache } = {
*/
export const globals: {
rootParam: string
dcLib: string
xlmaps: XLMapListItem[]
editor: any
viewer: any
viewboxes: ViewboxCache
@@ -57,11 +65,13 @@
[key: string]: any
} = {
rootParam: <string>'',
dcLib: '',
xlmaps: [],
editor: {
startupSet: <boolean>false,
treeNodeLibraries: <any[] | null>[],
libsAndTables: <any[]>[],
libraries: <string[] | undefined>[],
library: <string>'',
table: <string>'',
filter: <FilterCache>{


@@ -168,7 +168,7 @@
</button>
<clr-dropdown-menu *clrIfOpen clrPosition="bottom-left">
<a [routerLink]="['/view']" clrDropdownItem>VIEW</a>
<a [routerLink]="['/home']" clrDropdownItem>LOAD</a>
<a [routerLink]="['/review/submitted']" clrDropdownItem>REVIEW</a>
</clr-dropdown-menu>
</clr-dropdown>
@@ -189,7 +189,7 @@
router.url.includes('edit-record') ||
router.url.includes('home')
"
>LOAD</a
>
<a
[routerLink]="['/review/submitted']"


@@ -4,19 +4,19 @@
* The full license information can be found in LICENSE in the root directory of this project.
*/
import { ModuleWithProviders } from '@angular/core'
import { RouterModule, Routes } from '@angular/router'
import { NotFoundComponent } from './not-found/not-found.component'
import { DeployModule } from './deploy/deploy.module'
import { EditorModule } from './editor/editor.module'
import { HomeModule } from './home/home.module'
import { LicensingModule } from './licensing/licensing.module'
import { ReviewModule } from './review/review.module'
import { ReviewRouteComponent } from './routes/review-route/review-route.component'
import { StageModule } from './stage/stage.module'
import { SystemModule } from './system/system.module'
import { ViewerModule } from './viewer/viewer.module'
/**
* Defining routes
@@ -45,7 +45,7 @@ export const ROUTES: Routes = [
path: 'licensing',
loadChildren: () => LicensingModule
},
{ path: 'home', loadChildren: () => HomeModule },
{
/**
* Load editor module with subroutes


@@ -24,8 +24,8 @@
generatedRecordUrl
? 'copy to clipboard'
: generateEditRecordUrlLoading
? 'Generating url...'
: 'Link to this record'
}}
</button>
</ng-container>


@@ -280,7 +280,7 @@
licenceState.value.editor_rows_allowed === 1
? 'row'
: 'rows'
}}, contact support&#64;datacontroller.io</span
>
</clr-tooltip-content>
</clr-tooltip>
@@ -417,7 +417,7 @@
licenceState.value.editor_rows_allowed === 1
? 'row'
: 'rows'
}}, contact support&#64;datacontroller.io</span
>
</clr-tooltip-content>
</clr-tooltip>
@@ -467,7 +467,7 @@
: 'rows'
}}
will be submitted. To remove the restriction, contact
support&#64;datacontroller.io</span
>
<div *ngIf="tableTrue" class="clr-offset-md-2 clr-col-md-8">
<div class="form-group">
@@ -528,7 +528,7 @@
Due to current licence, only
{{ licenceState.value.submit_rows_limit }} rows in a file will
be submitted. To remove the restriction, contact
support&#64;datacontroller.io
</p>
</div>
<div class="modal-footer">


@@ -38,7 +38,7 @@ import { HotTableInterface } from '../models/HotTable.interface'
import {
$DataFormats,
DSMeta,
EditorsGetDataServiceResponse
} from '../models/sas/editors-getdata.model'
import { DataFormat } from '../models/sas/common/DateFormat'
import SheetInfo from '../models/SheetInfo'
@@ -2964,7 +2964,7 @@ export class EditorComponent implements OnInit, AfterViewInit {
await this.sasStoreService
.callService(myParams, 'SASControlTable', 'editors/getdata', this.libds)
.then((res: EditorsGetDataServiceResponse) => {
this.initSetup(res)
})
.catch((err: any) => {
@@ -2976,7 +2976,7 @@
ngAfterViewInit() {}
initSetup(response: EditorsGetDataServiceResponse) {
this.hotInstance = this.hotRegisterer.getInstance('hotInstance')
if (this.getdataError) return


@@ -0,0 +1,23 @@
import { NgModule } from '@angular/core'
import { RouterModule, Routes } from '@angular/router'
import { HomeRouteComponent } from '../routes/home-route/home-route.component'
import { HomeComponent } from './home.component'
import { XLMapModule } from '../xlmap/xlmap.module'
const routes: Routes = [
{
path: '',
component: HomeRouteComponent,
children: [
{ path: '', pathMatch: 'full', redirectTo: 'tables' },
{ path: 'tables', component: HomeComponent },
{ path: 'files', loadChildren: () => XLMapModule }
]
}
]
@NgModule({
imports: [RouterModule.forChild(routes)],
exports: [RouterModule]
})
export class HomeRoutingModule {}


@@ -100,7 +100,7 @@
*clrIfOpen
>
<span *ngIf="tableLocked">
To unlock all tables, contact support&#64;datacontroller.io
</span>
</clr-tooltip-content>
</clr-tooltip>


@@ -1,15 +1,18 @@
import { CommonModule } from '@angular/common'
import { NgModule } from '@angular/core'
import { FormsModule } from '@angular/forms'
import { ClarityModule } from '@clr/angular'
import { AppSharedModule } from '../app-shared.module'
import { DirectivesModule } from '../directives/directives.module'
import { HomeRouteComponent } from '../routes/home-route/home-route.component'
import { DcTreeModule } from '../shared/dc-tree/dc-tree.module'
import { HomeRoutingModule } from './home-routing.module'
import { HomeComponent } from './home.component'
@NgModule({
declarations: [HomeComponent, HomeRouteComponent],
imports: [
HomeRoutingModule,
FormsModule,
ClarityModule,
AppSharedModule,


@@ -4,12 +4,12 @@ import { DQData, SASParam } from '../TableData'
import { BaseSASResponse } from './common/BaseSASResponse'
import { DataFormat } from './common/DateFormat'
export interface EditorsGetDataServiceResponse {
data: EditorsGetDataSASResponse
libds: string
}
export interface EditorsGetDataSASResponse extends BaseSASResponse {
$sasdata: $DataFormats
sasdata: Sasdata[]
sasparams: SASParam[]


@@ -72,7 +72,7 @@
>
To unlock more than
{{ licenceState.value.history_rows_allowed }} records, contact
support&#64;datacontroller.io
</p>
</div>


@@ -0,0 +1 @@
<router-outlet></router-outlet>


@@ -0,0 +1,17 @@
import { Component, OnInit, OnDestroy } from '@angular/core'
@Component({
selector: 'app-home-route',
templateUrl: './home-route.component.html',
styleUrls: ['./home-route.component.scss'],
host: {
class: 'content-container'
}
})
export class HomeRouteComponent implements OnInit, OnDestroy {
constructor() {}
ngOnInit() {}
ngOnDestroy() {}
}


@@ -0,0 +1 @@
<router-outlet></router-outlet>


@@ -0,0 +1,17 @@
import { Component, OnInit, OnDestroy } from '@angular/core'
@Component({
selector: 'app-xlmap-route',
templateUrl: './xlmap-route.component.html',
styleUrls: ['./xlmap-route.component.scss'],
host: {
class: 'content-container'
}
})
export class XLMapRouteComponent implements OnInit, OnDestroy {
constructor() {}
ngOnInit() {}
ngOnDestroy() {}
}


@@ -74,6 +74,7 @@ export class AppService {
missingProps.push('Globvars')
if (!res.sasdatasets) missingProps.push('Sasdatasets')
if (!res.saslibs) missingProps.push('Saslibs')
if (!res.xlmaps) missingProps.push('XLMaps')
if (missingProps.length > 0) {
startupServiceError = true
@@ -135,10 +136,17 @@
globals.editor.libsAndTables = libsAndTables
}
globals.xlmaps = res.xlmaps.map((xlmap: any) => ({
id: xlmap[0],
description: xlmap[1],
targetDS: xlmap[2]
}))
globals.editor.treeNodeLibraries = treeNodeLibraries
globals.editor.libraries = libraries
globals.editor.startupSet = true
globals.dcLib = res.globvars[0].DCLIB
await this.licenceService.activation(res)
})
.catch((err: any) => {
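The startup service delivers each xlmap as a positional array, which AppService reshapes into named fields for the globals cache. A minimal standalone sketch of that reshaping, using a made-up sample row (the real rows come from the SAS startup response):

```typescript
// Shape cached in the app globals (mirrors XLMapListItem in _globals)
interface XLMapListItem {
  id: string
  description: string
  targetDS: string
}

// Hypothetical sample row; the positional order [id, description, targetDS]
// matches the mapping in the diff above.
const rows: string[][] = [['SALESMAP', 'Monthly sales upload', 'MYLIB.SALES']]

// Same positional-to-named mapping as the AppService change
const xlmaps: XLMapListItem[] = rows.map((xlmap) => ({
  id: xlmap[0],
  description: xlmap[1],
  targetDS: xlmap[2]
}))

console.log(xlmaps[0].targetDS) // MYLIB.SALES
```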


@@ -10,8 +10,8 @@ import { globals } from '../_globals'
import { FilterClause, FilterGroup, FilterQuery } from '../models/FilterQuery'
import {
$DataFormats,
EditorsGetDataSASResponse,
EditorsGetDataServiceResponse
} from '../models/sas/editors-getdata.model'
import { LoggerService } from './logger.service'
import { isSpecialMissing } from '@sasjs/utils/input/validators'
@@ -57,13 +57,13 @@ export class SasStoreService {
libds: string
) {
this.libds = libds
const tables: any = {}
tables[tableName] = [tableData]
const res: EditorsGetDataSASResponse = await this.sasService.request(
program,
tables
)
const response: EditorsGetDataServiceResponse = {
data: res,
libds: this.libds
}
@@ -209,6 +209,14 @@
return res
}
public async getXLMapRules(id: string) {
const tables = {
getxlmaps_in: [{ XLMAP_ID: id }]
}
const res: any = await this.sasService.request('editors/getxlmaps', tables)
return res
}
public async getDetails(tableData: any, tableName: string, program: string) {
let tables: any = {}
tables[tableName] = [tableData]
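The new `getXLMapRules` method wraps a single-row input table for the `editors/getxlmaps` backend service. A minimal sketch of the payload it builds, factored into a helper for illustration (the helper name and the sample map id are made up; the `getxlmaps_in` / `XLMAP_ID` shape is from the diff above):

```typescript
// Build the request tables the same way getXLMapRules does:
// one input table, getxlmaps_in, containing the requested map id.
function buildGetXLMapsPayload(id: string) {
  return {
    getxlmaps_in: [{ XLMAP_ID: id }]
  }
}

const payload = buildGetXLMapsPayload('SALESMAP') // hypothetical map id
console.log(payload.getxlmaps_in[0].XLMAP_ID) // SALESMAP
```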


@@ -2,5 +2,5 @@
[ngClass]="classes"
[class.unset]="classes !== ''"
href="mailto:support@datacontroller.io?subject=Licence"
>support&#64;datacontroller.io</a
>


@@ -106,7 +106,7 @@
*clrIfOpen
>
<span *ngIf="tableLocked">
To unlock all tables, contact support&#64;datacontroller.io
</span>
</clr-tooltip-content>


@@ -107,7 +107,29 @@
</clr-tab-content>
</clr-tab>
</clr-tabs>
<p *ngIf="isMainRoute('home')" class="page-title">Edit</p>
<div
*ngIf="isMainRoute('home')"
class="d-flex justify-content-center sub-dropdown"
>
<clr-dropdown>
<button class="dropdown-toggle btn btn-link" clrDropdownTrigger>
{{ getSubPage() }}
<clr-icon shape="caret down"></clr-icon>
</button>
<clr-dropdown-menu *clrIfOpen>
<a
clrVerticalNavLink
routerLink="/home/tables"
routerLinkActive="active"
>Tables</a
>
<a clrVerticalNavLink routerLink="/home/files" routerLinkActive="active"
>Files</a
>
</clr-dropdown-menu>
</clr-dropdown>
</div>
<div class="nav-divider"></div>

View File

@@ -13,7 +13,7 @@
class="licence-notice"
>To unlock more then {{ licenceState.value.viewbox_limit }}
{{ licenceState.value.viewbox_limit === 1 ? 'viewbox' : 'viewboxes' }},
-contact support@datacontroller.io</span
+contact support&#64;datacontroller.io</span
>
</h3>

View File

@@ -7,6 +7,7 @@ import { EventService } from '../services/event.service'
import { AppService } from '../services/app.service'
import { HotTableInterface } from '../models/HotTable.interface'
import { LicenceService } from '../services/licence.service'
import { globals } from '../_globals'
@Component({
selector: 'app-stage',
@@ -55,7 +56,15 @@ export class StageComponent implements OnInit {
}
public goBack() {
-this.route.navigateByUrl('/editor/' + this.tableDetails.BASE_TABLE)
+const xlmap = globals.xlmaps.find(
(xlmap) => xlmap.targetDS === this.tableDetails.BASE_TABLE
)
if (xlmap) {
const id = this.hotTable.data[0].XLMAP_ID
this.route.navigateByUrl('/home/files/' + id)
} else {
this.route.navigateByUrl('/editor/' + this.tableDetails.BASE_TABLE)
}
}
public download(id: any) {

View File

@@ -105,7 +105,7 @@
*clrIfOpen
>
<span *ngIf="tableLocked">
-To unlock all tables, contact support@datacontroller.io
+To unlock all tables, contact support&#64;datacontroller.io
</span>
</clr-tooltip-content>
</clr-tooltip>
@@ -630,6 +630,9 @@
[cells]="hotTable.cells"
[maxRows]="hotTable.maxRows"
[manualColumnResize]="true"
[rowHeaders]="hotTable.rowHeaders"
[rowHeaderWidth]="hotTable.rowHeaderWidth"
[rowHeights]="hotTable.rowHeights"
[licenseKey]="hotTable.licenseKey"
>
</hot-table>

View File

@@ -108,6 +108,11 @@ export class ViewerComponent implements AfterContentInit, AfterViewInit {
settings: {},
afterGetColHeader: undefined,
licenseKey: undefined,
rowHeaders: (index: number) => {
return ' '
},
rowHeaderWidth: 15,
rowHeights: 20,
contextMenu: ['copy_with_column_headers', 'copy_column_headers_only'],
copyPaste: {
copyColumnHeaders: true,

View File

@@ -0,0 +1,159 @@
import {
extractRowAndCol,
getCellAddress,
getFinishingCell,
isBlankRow
} from '../utils/xl.utils'
describe('isBlankRow', () => {
it('should return true for a blank row', () => {
const blankRow = { __rowNum__: 1 }
expect(isBlankRow(blankRow)).toBeTrue()
})
it('should return false for a non-blank row', () => {
const nonBlankRow = {
B: 3,
C: 'some value',
D: -203
}
expect(isBlankRow(nonBlankRow)).toBeFalse()
})
})
describe('extractRowAndCol', () => {
it('should extract row and column from "MATCH F R[2]C[0]: CASH BALANCE"', () => {
const input = 'MATCH F R[2]C[0]: CASH BALANCE'
const result = extractRowAndCol(input)
expect(result).toEqual({ row: 2, column: 0 })
})
it('should extract row and column from "RELATIVE R[10]C[6]"', () => {
const input = 'RELATIVE R[10]C[6]'
const result = extractRowAndCol(input)
expect(result).toEqual({ row: 10, column: 6 })
})
it('should return null for invalid input', () => {
const invalidInput = 'INVALID INPUT'
const result = extractRowAndCol(invalidInput)
expect(result).toBeNull()
})
})
describe('getCellAddress', () => {
const arrayOfObjects = [
{ A: 'valueA1', B: 'valueB1' },
{ A: 'valueA2', B: 'valueB2' }
]
it('should convert "ABSOLUTE D8" to A1-style address', () => {
const input = 'ABSOLUTE D8'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('D8')
})
it('should convert "RELATIVE R[10]C[6]" to A1-style address', () => {
const input = 'RELATIVE R[10]C[6]'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('F10')
})
it('should convert "MATCH 1 R[0]C[0]:valueA1" to A1-style address', () => {
const input = 'MATCH 1 R[0]C[0]:valueA1'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('A1')
})
it('should convert "MATCH A R[0]C[0]:valueA1" to A1-style address', () => {
const input = 'MATCH A R[0]C[0]:valueA1'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('A1')
})
it('should convert "MATCH 1 R[1]C[0]:valueA1" to A1-style address', () => {
const input = 'MATCH 1 R[1]C[0]:valueA1'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('A2')
})
it('should convert "MATCH A R[0]C[1]:valueA1" to A1-style address', () => {
const input = 'MATCH A R[0]C[1]:valueA1'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('B1')
})
it('should convert "MATCH 1 R[1]C[1]:valueA1" to A1-style address', () => {
const input = 'MATCH 1 R[1]C[1]:valueA1'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('B2')
})
it('should convert "MATCH A R[1]C[1]:valueA1" to A1-style address', () => {
const input = 'MATCH A R[1]C[1]:valueA1'
const result = getCellAddress(input, arrayOfObjects)
expect(result).toBe('B2')
})
})
describe('getFinishingCell', () => {
const arrayOfObjects = [
{ A: 'valueA1', B: 'valueB1' },
{ A: 'valueA2', B: 'valueB2' },
{ A: 'valueA3', B: 'valueB3' },
{ B: 'valueB4' },
{ A: 'valueA5' },
{ A: 'valueA6', B: 'valueB6' },
{},
{ A: 'valueA8' }
]
it('should return the start cell if finish is an empty string', () => {
const start = 'A1'
const finish = ''
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe(start)
})
it('should convert "ABSOLUTE D8" to A1-style address', () => {
const start = 'A1'
const finish = 'ABSOLUTE D8'
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe('D8')
})
it('should convert "RELATIVE R[2]C[1]" to A1-style address', () => {
const start = 'A1'
const finish = 'RELATIVE R[2]C[1]'
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe('B3')
})
it('should convert "MATCH A R[0]C[1]:valueA1" to A1-style address', () => {
const start = 'A1'
const finish = 'MATCH A R[0]C[1]:valueA1'
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe('B1')
})
it('should convert "MATCH 1 R[4]C[0]:valueB1" to A1-style address', () => {
const start = 'A1'
const finish = 'MATCH 1 R[4]C[0]:valueB1'
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe('B5')
})
it('should convert "LASTDOWN" to A1-style address of the last non-blank cell in column A', () => {
const start = 'A1'
const finish = 'LASTDOWN'
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe('A3')
})
it('should convert "BLANKROW" to A1-style address of the last row with blank cells', () => {
const start = 'A1'
const finish = 'BLANKROW'
const result = getFinishingCell(start, finish, arrayOfObjects)
expect(result).toBe('B6')
})
})

View File

@@ -0,0 +1,31 @@
export const blobToFile = (blob: Blob, fileName: string): File => {
const file = new File([blob], fileName, {
lastModified: new Date().getTime()
})
return file
}
/**
* Convert an array of bytes (Uint8Array) to a binary string.
* @param {Uint8Array} res - The array of bytes to convert.
* @returns {string} The binary string representation of the array of bytes.
*/
export const byteArrayToBinaryString = (res: Uint8Array): string => {
// Create a Uint8Array from the input array (if it's not already)
const bytes = new Uint8Array(res)
// Initialize an empty string to store the binary representation
let binary = ''
// Get the length of the byte array
const length = bytes.byteLength
// Iterate through each byte in the array
for (let i = 0; i < length; i++) {
// Convert each byte to its binary representation and append to the string
binary += String.fromCharCode(bytes[i])
}
// Return the binary string
return binary
}
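The conversion above can be exercised standalone. A minimal sketch, with the helper reproduced inline so the snippet is self-contained:

```typescript
// Reproduction of byteArrayToBinaryString for standalone use:
// each byte maps to one character, so 72 -> 'H' and 105 -> 'i'.
const byteArrayToBinaryString = (res: Uint8Array): string => {
  const bytes = new Uint8Array(res)
  let binary = ''
  for (let i = 0; i < bytes.byteLength; i++) {
    binary += String.fromCharCode(bytes[i])
  }
  return binary
}

const bstr = byteArrayToBinaryString(new Uint8Array([72, 105]))
// bstr === 'Hi'
```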

View File

@@ -0,0 +1,225 @@
import * as XLSX from '@sheet/crypto'
/**
* Checks if an excel row is blank or not
*
* @param row object is of shape {[key: string]: any}
*/
export const isBlankRow = (row: any) => {
for (const key in row) {
if (key !== '__rowNum__') {
return false
}
}
return true
}
/**
* Extracts row and column number from xlmap rule.
*
* Input string should be in form of
* either "MATCH F R[2]C[0]: CASH BALANCE" or "RELATIVE R[10]C[6]"
*/
export const extractRowAndCol = (str: string) => {
// Regular expression to match and capture the values inside square brackets
const regex = /R\[(\d+)\]C\[(\d+)\]/
// Match the regular expression against the input string
const match = str.match(regex)
if (!match) {
return null
}
// Extract values from the match groups
const row = parseInt(match[1], 10)
const column = parseInt(match[2], 10)
return {
row,
column
}
}
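A quick sketch of the extraction in isolation (the function is reproduced inline so the snippet runs on its own):

```typescript
// Reproduction of extractRowAndCol: pulls the bracketed offsets out of
// an xlmap rule string such as 'RELATIVE R[10]C[6]'.
const extractRowAndCol = (str: string) => {
  const regex = /R\[(\d+)\]C\[(\d+)\]/
  const match = str.match(regex)
  if (!match) return null
  return { row: parseInt(match[1], 10), column: parseInt(match[2], 10) }
}

const rel = extractRowAndCol('RELATIVE R[10]C[6]')
// rel → { row: 10, column: 6 }
const bad = extractRowAndCol('INVALID INPUT')
// bad → null
```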
/**
* Generate an A1-Style excel cell address from xlmap rule.
*
* Expect "ABSOLUTE D8" or "RELATIVE R[10]C[6]" or
* "MATCH C R[0]C[4]:Common Equity Tier 1 (CET1)" kinds of string as rule input
*/
export const getCellAddress = (rule: string, arrayOfObjects: any[]) => {
if (rule.startsWith('ABSOLUTE ')) {
rule = rule.replace('ABSOLUTE ', '')
}
if (rule.startsWith('RELATIVE ')) {
const rowAndCol = extractRowAndCol(rule)
if (rowAndCol) {
const { row, column } = rowAndCol
// Generate an A1-Style address string from a SheetJS cell address
// Spreadsheet applications typically display ordinal row numbers,
// where 1 is the first row, 2 is the second row, etc. The numbering starts at 1.
// SheetJS follows JavaScript counting conventions,
// where 0 is the first row, 1 is the second row, etc. The numbering starts at 0.
// Therefore, we have to subtract 1 from row and column to match SheetJS indexing convention
rule = XLSX.utils.encode_cell({ r: row - 1, c: column - 1 })
}
}
if (rule.startsWith('MATCH ')) {
let targetValue = ''
// using a regular expression to match "C[x]:" and extract the value after it
const match = rule.match(/C\[\d+\]:(.+)/)
// Check if there is a match
if (match) {
// Extract the value after "C[x]:"
targetValue = match[1]
}
// Split the string by spaces to get target row/column
const splittedArray = rule.split(' ')
// Extract the second word
const secondWord = splittedArray[1]
let targetColumn = ''
let targetRow = -1
let cellAddress = ''
// Check if the secondWord is a number
if (!isNaN(Number(secondWord))) {
targetRow = parseInt(secondWord)
} else {
targetColumn = secondWord
}
if (targetRow !== -1) {
// sheetJS index starts from 0,
// therefore, decremented 1 to make it correct row address for js array
const row = arrayOfObjects[targetRow - 1]
for (const col in row) {
if (col !== '__rowNum__' && row[col] === targetValue) {
cellAddress = col + targetRow
break
}
}
} else {
for (let i = 0; i < arrayOfObjects.length; i++) {
const row = arrayOfObjects[i]
if (row[targetColumn] === targetValue) {
// sheetJS index starts from 0,
// therefore, incremented 1 to make it correct row address
const rowIndex = i + 1
cellAddress = targetColumn + rowIndex
break
}
}
}
// Converts A1 cell address to 0-indexed form
const matchedCellAddress = XLSX.utils.decode_cell(cellAddress)
// extract number of rows and columns that we have to move
// from matched cell to reach target cell
const rowAndCol = extractRowAndCol(rule)
if (rowAndCol) {
const { row, column } = rowAndCol
// Converts 0-indexed cell address to A1 form
rule = XLSX.utils.encode_cell({
r: matchedCellAddress.r + row,
c: matchedCellAddress.c + column
})
}
}
return rule
}
/**
* Generate an A1-Style excel cell address for last cell
*
* @param start A1 style excel cell address
* @param finish XLMAP_FINISH attribute of xlmap rule
* @param arrayOfObjects an array of row objects
* @returns
*/
export const getFinishingCell = (
start: string,
finish: string,
arrayOfObjects: any[]
) => {
// in this case an individual cell would be extracted
if (finish === '') {
return start
}
if (finish.startsWith('ABSOLUTE ')) {
finish = finish.replace('ABSOLUTE ', '')
}
if (finish.startsWith('RELATIVE ')) {
const rowAndCol = extractRowAndCol(finish)
if (rowAndCol) {
const { row, column } = rowAndCol
const { r, c } = XLSX.utils.decode_cell(start)
// finish is relative to starting point
// therefore, we need to add extracted row and columns
// in starting cell address to get actual finishing cell
finish = XLSX.utils.encode_cell({ r: r + row, c: c + column })
}
}
if (finish.startsWith('MATCH ')) {
finish = getCellAddress(finish, arrayOfObjects)
}
if (finish === 'LASTDOWN') {
const { r, c } = XLSX.utils.decode_cell(start)
const colName = XLSX.utils.encode_col(c)
let lastNonBlank = r
for (let i = r + 1; i < arrayOfObjects.length; i++) {
const row = arrayOfObjects[i]
if (!row[colName]) {
break
}
lastNonBlank = i
}
finish = colName + (lastNonBlank + 1) // excel numbering starts from 1. So incremented 1 to 0 based index
}
if (finish === 'BLANKROW') {
const { r } = XLSX.utils.decode_cell(start)
let lastNonBlankRow = r
for (let i = r + 1; i < arrayOfObjects.length; i++) {
const row = arrayOfObjects[i]
if (isBlankRow(row)) {
break
}
lastNonBlankRow = i
}
const row = arrayOfObjects[lastNonBlankRow]
// Get the keys of the object (excluding '__rowNum__')
const keys = Object.keys(row).filter((key) => key !== '__rowNum__')
// Finding last column in a row
// Find the key with the highest alphanumeric value (assumes keys are letters)
const lastColumn = keys.reduce(
(maxKey, currentKey) => (currentKey > maxKey ? currentKey : maxKey),
''
)
// make finishing cell address in A1 style
finish = lastColumn + (lastNonBlankRow + 1) // excel numbering starts from 1. So incremented 1 to 0 based index
}
return finish
}
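The BLANKROW branch above finds the last populated column with a string-comparison reduce. A self-contained sketch of just that step (note the comparison assumes single-letter column keys, as the comment in the source says; 'AA' would sort before 'B'):

```typescript
// Sketch of the "last column" reduction from the BLANKROW branch:
// a plain string comparison over the row's keys, valid for columns A..Z.
const lastColumn = (row: Record<string, unknown>): string =>
  Object.keys(row)
    .filter((key) => key !== '__rowNum__')
    .reduce((maxKey, currentKey) => (currentKey > maxKey ? currentKey : maxKey), '')

const col = lastColumn({ A: 'x', C: 'y', B: 'z', __rowNum__: 5 })
// col → 'C'
```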

View File

@@ -0,0 +1,22 @@
import { NgModule } from '@angular/core'
import { RouterModule, Routes } from '@angular/router'
import { XLMapComponent } from '../xlmap/xlmap.component'
import { XLMapRouteComponent } from '../routes/xlmap-route/xlmap-route.component'
const routes: Routes = [
{
path: '',
component: XLMapRouteComponent,
children: [
{ path: '', component: XLMapComponent },
{ path: ':id', component: XLMapComponent }
]
}
]
@NgModule({
imports: [RouterModule.forChild(routes)],
exports: [RouterModule]
})
export class XLMapRoutingModule {}

View File

@@ -0,0 +1,252 @@
<app-sidebar>
<div *ngIf="xlmapsLoading" class="my-10-mx-auto text-center">
<clr-spinner clrMedium></clr-spinner>
</div>
<clr-tree>
<clr-tree-node class="search-node">
<div class="tree-search-wrapper">
<input
clrInput
#searchXLMapTreeInput
placeholder="Filter by Id"
name="input"
[(ngModel)]="searchString"
(keyup)="xlmapListOnFilter()"
autocomplete="off"
/>
<clr-icon
*ngIf="searchXLMapTreeInput.value.length < 1"
shape="search"
></clr-icon>
<clr-icon
*ngIf="searchXLMapTreeInput.value.length > 0"
(click)="searchString = ''; xlmapListOnFilter()"
shape="times"
></clr-icon>
</div>
</clr-tree-node>
<ng-container *ngFor="let xlmap of xlmaps">
<clr-tree-node>
<button
(click)="xlmapOnClick(xlmap)"
class="clr-treenode-link"
[class.table-active]="isActiveXLMap(xlmap.id)"
>
<clr-icon shape="file"></clr-icon>
{{ xlmap.id }}
</button>
</clr-tree-node>
</ng-container>
</clr-tree>
</app-sidebar>
<div class="content-area">
<div *ngIf="!selectedXLMap" class="no-table-selected">
<clr-icon
shape="warning-standard"
size="60"
class="is-info icon-dc-fill"
></clr-icon>
<h3 *ngIf="xlmaps.length > 0" class="text-center color-gray">
Please select a map
</h3>
<h3 *ngIf="xlmaps.length < 1" class="text-center color-gray">
No excel map is found
</h3>
</div>
<div class="loadingSpinner" *ngIf="isLoading">
<span class="spinner"> Loading... </span>
<div>
<h4>{{ isLoadingDesc }}</h4>
</div>
</div>
<div
appDragNdrop
(fileDraggedOver)="onShowUploadModal()"
class="card h-100 d-flex clr-flex-column"
*ngIf="!isLoading && selectedXLMap"
>
<clr-tabs>
<clr-tab>
<button clrTabLink (click)="selectedTab = TabsEnum.Rules">Rules</button>
<clr-tab-content *clrIfActive="selectedTab === TabsEnum.Rules">
</clr-tab-content>
</clr-tab>
<clr-tab>
<button clrTabLink (click)="selectedTab = TabsEnum.Data">Data</button>
<clr-tab-content *clrIfActive="selectedTab === TabsEnum.Data">
</clr-tab-content>
</clr-tab>
</clr-tabs>
<ng-container *ngTemplateOutlet="actionButtons"></ng-container>
<div class="clr-row m-0 mb-10-i viewerTitle">
<h3 class="d-flex clr-col-12 clr-justify-content-center mt-5-i">
{{ selectedXLMap.id }}
</h3>
<i class="d-flex clr-col-12 clr-justify-content-center mt-5-i">{{
selectedXLMap.description
}}</i>
<h5 class="d-flex clr-col-12 clr-justify-content-center mt-5-i">
Rules Source:
<a class="ml-10" [routerLink]="'/view/data/' + rulesSource">
{{ rulesSource }}
</a>
</h5>
<h5 class="d-flex clr-col-12 clr-justify-content-center mt-5-i">
Target dataset:
<a class="ml-10" [routerLink]="'/view/data/' + selectedXLMap.targetDS">
{{ selectedXLMap.targetDS }}
</a>
</h5>
</div>
<div class="clr-flex-1">
<hot-table
hotId="hotInstance"
id="hot-table"
[multiColumnSorting]="true"
[viewportRowRenderingOffset]="50"
[data]="selectedTab === TabsEnum.Rules ? xlmapRules : xlData"
[colHeaders]="
selectedTab === TabsEnum.Rules ? xlmapRulesHeaders : xlUploadHeader
"
[columns]="
selectedTab === TabsEnum.Rules ? xlmapRulesColumns : xlUploadColumns
"
[filters]="true"
[height]="'100%'"
stretchH="all"
[modifyColWidth]="maxWidthChecker"
[cells]="getCellConfiguration"
[maxRows]="hotTableMaxRows"
[manualColumnResize]="true"
[rowHeaders]="rowHeaders"
[rowHeaderWidth]="15"
[rowHeights]="20"
[licenseKey]="hotTableLicenseKey"
>
</hot-table>
</div>
</div>
<clr-modal
appFileDrop
(fileOver)="fileOverBase($event)"
(fileDrop)="getFileDesc($event, true)"
[uploader]="uploader"
[clrModalSize]="'xl'"
[clrModalStaticBackdrop]="false"
[clrModalClosable]="true"
[(clrModalOpen)]="showUploadModal"
class="relative"
>
<h3 class="modal-title">Upload File</h3>
<div class="modal-body">
<div class="drop-area">
<span>Drop file anywhere to upload!</span>
</div>
<div class="clr-col-md-12">
<div class="clr-row card-block mt-15 d-flex justify-content-between">
<div class="clr-col-md-3 filterBtn">
<span class="filterBtn w-100">
<label
for="file-upload"
class="btn btn-sm btn-outline profile-buttons w-100"
>
Browse
</label>
</span>
<input
hidden
#fileUploadInput
id="file-upload"
type="file"
appFileSelect
[uploader]="uploader"
(change)="getFileDesc($event)"
/>
</div>
</div>
</div>
</div>
</clr-modal>
<clr-modal [(clrModalOpen)]="submitLimitNotice">
<h3 class="modal-title">Notice</h3>
<div class="modal-body">
<p class="m-0">
Due to current licence, only
{{ licenceState.value.submit_rows_limit }} rows in a file will be
submitted. To remove the restriction, contact
support&#64;datacontroller.io
</p>
</div>
<div class="modal-footer">
<button
type="button"
class="btn btn-sm btn-primary"
(click)="submitLimitNotice = false"
>
Cancel
</button>
<button
type="button"
class="btn btn-sm btn-primary"
(click)="submit(); submitLimitNotice = false"
>
Submit
</button>
</div>
</clr-modal>
</div>
<ng-template #actionButtons>
<div class="clr-row m-0 clr-justify-content-center">
<div
*ngIf="status === StatusEnum.ReadyToUpload"
class="d-flex clr-justify-content-center clr-col-12 clr-col-lg-4"
>
<button
type="button"
class="btn btn-sm btn-success btn-block mr-0"
(click)="onShowUploadModal()"
>
<clr-icon shape="upload"></clr-icon>
<span>Upload</span>
</button>
</div>
<div
*ngIf="status === StatusEnum.ReadyToSubmit"
class="d-flex clr-justify-content-center clr-col-12 clr-col-lg-4"
>
<button
type="button"
class="btn btn-sm btn-success btn-block mr-0"
(click)="submitExcel()"
>
<clr-icon shape="upload"></clr-icon>
<span>Submit</span>
</button>
</div>
<div
*ngIf="status === StatusEnum.ReadyToSubmit"
class="d-flex clr-justify-content-center clr-col-12 clr-col-lg-4"
>
<button
type="button"
class="btn btn-sm btn-outline-danger btn-block mr-0"
(click)="discardExtractedData()"
>
<clr-icon shape="times"></clr-icon>
<span>Discard</span>
</button>
</div>
</div>
</ng-template>

View File

@@ -0,0 +1,77 @@
.card {
margin-top: 0;
flex: 1;
display: flex;
flex-direction: column;
}
clr-tree-node button {
white-space: nowrap;
}
.no-table-selected {
position: relative;
}
.header-row {
.title-col {
display: flex;
align-items: center;
}
.options-col {
display: flex;
justify-content: flex-end;
}
}
.sw {
margin: 1rem 0rem 0.5rem 1rem;
}
.viewerTitle {
text-align: center;
}
.cardFlex {
display: flex;
justify-content: center;
}
.content-area {
padding: 0.5rem !important;
display: flex;
flex-direction: column;
}
hot-table {
::ng-deep {
.primaryKeyHeaderStyle {
background: #306b006e;
}
}
}
.drop-area {
position: fixed;
top: 0;
left: 0;
bottom: 0;
right: 0;
display: flex;
justify-content: center;
margin: 1px;
border: 2px dashed #fff;
z-index: -1;
span {
font-size: 20px;
margin-top: 20px;
color: #fff;
}
}

View File

@@ -0,0 +1,477 @@
import {
AfterContentInit,
AfterViewInit,
Component,
ElementRef,
HostBinding,
OnInit,
QueryList,
ViewChildren
} from '@angular/core'
import { ActivatedRoute, Router } from '@angular/router'
import { UploadFile } from '@sasjs/adapter'
import * as XLSX from '@sheet/crypto'
import { XLMapListItem, globals } from '../_globals'
import { FileUploader } from '../models/FileUploader.class'
import {
EventService,
LicenceService,
LoggerService,
SasService,
SasStoreService
} from '../services'
import { getCellAddress, getFinishingCell } from './utils/xl.utils'
import { blobToFile, byteArrayToBinaryString } from './utils/file.utils'
interface XLMapRule {
XLMAP_ID: string
XLMAP_SHEET: string
XLMAP_RANGE_ID: string
XLMAP_START: string
XLMAP_FINISH: string
}
interface XLUploadEntry {
LOAD_REF: string
XLMAP_ID: string
XLMAP_RANGE_ID: string
ROW_NO: number
COL_NO: number
VALUE_TXT: string
}
enum Status {
NoMapSelected,
FetchingRules,
ReadyToUpload,
ExtractingData,
ReadyToSubmit,
SubmittingExtractedData,
Submitting
}
enum Tabs {
Rules,
Data
}
@Component({
selector: 'app-xlmap',
templateUrl: './xlmap.component.html',
styleUrls: ['./xlmap.component.scss']
})
export class XLMapComponent implements AfterContentInit, AfterViewInit, OnInit {
@HostBinding('class.content-container') contentContainerClass = true
@ViewChildren('fileUploadInput')
fileUploadInputCompList: QueryList<ElementRef> = new QueryList()
StatusEnum = Status
TabsEnum = Tabs
public selectedTab = Tabs.Rules
public rulesSource = globals.dcLib + '.MPE_XLMAP_RULES'
public xlmaps: XLMapListItem[] = []
public selectedXLMap: XLMapListItem | undefined = undefined
public searchString = ''
public xlmapsLoading = true
public isLoading = false
public isLoadingDesc = ''
public status = Status.NoMapSelected
public xlmapRulesHeaders = [
'XLMAP_SHEET',
'XLMAP_RANGE_ID',
'XLMAP_START',
'XLMAP_FINISH'
]
public xlmapRulesColumns = [
{
data: 'XLMAP_SHEET'
},
{
data: 'XLMAP_RANGE_ID'
},
{
data: 'XLMAP_START'
},
{
data: 'XLMAP_FINISH'
}
]
public xlmapRules: XLMapRule[] = []
public xlUploadHeader = ['XLMAP_RANGE_ID', 'ROW_NO', 'COL_NO', 'VALUE_TXT']
public xlUploadColumns = [
{
data: 'XLMAP_RANGE_ID'
},
{
data: 'ROW_NO'
},
{
data: 'COL_NO'
},
{
data: 'VALUE_TXT'
}
]
public xlData: XLUploadEntry[] = []
public showUploadModal = false
public hasBaseDropZoneOver = false
public filename = ''
public submitLimitNotice = false
public uploader: FileUploader = new FileUploader()
public licenceState = this.licenceService.licenceState
public hotTableLicenseKey: string | undefined = undefined
public hotTableMaxRows =
this.licenceState.value.viewer_rows_allowed || Infinity
constructor(
private eventService: EventService,
private licenceService: LicenceService,
private loggerService: LoggerService,
private route: ActivatedRoute,
private router: Router,
private sasStoreService: SasStoreService,
private sasService: SasService
) {}
public xlmapOnClick(xlmap: XLMapListItem) {
if (xlmap.id !== this.selectedXLMap?.id) {
this.selectedXLMap = xlmap
this.viewXLMapRules()
this.router.navigateByUrl('/home/files/' + xlmap.id)
}
}
public xlmapListOnFilter() {
if (this.searchString.length > 0) {
const array: XLMapListItem[] = globals.xlmaps
this.xlmaps = array.filter((item) =>
item.id.toLowerCase().includes(this.searchString.toLowerCase())
)
} else {
this.xlmaps = globals.xlmaps
}
}
public isActiveXLMap(id: string) {
return this.selectedXLMap?.id === id
}
public maxWidthChecker(width: any, col: any) {
if (width > 200) return 200
else return width
}
public getCellConfiguration() {
return { readOnly: true }
}
public rowHeaders() {
return ' '
}
public onShowUploadModal() {
this.showUploadModal = true
}
/**
* Called by FileDropDirective
* @param e true if file is dragged over the drop zone
*/
public fileOverBase(e: boolean): void {
this.hasBaseDropZoneOver = e
}
public getFileDesc(event: any, dropped = false) {
const file = dropped ? event[0] : event.target.files[0]
if (!file) return
const filename = file.name
this.filename = filename
const fileType = filename.slice(
filename.lastIndexOf('.') + 1,
filename.lastIndexOf('.') + 4
)
if (fileType.toLowerCase() === 'xls') {
this.showUploadModal = false
this.isLoading = true
this.isLoadingDesc = 'Extracting Data'
this.status = Status.ExtractingData
const reader = new FileReader()
reader.onload = async (theFile: any) => {
/* read workbook */
const bstr = byteArrayToBinaryString(theFile.target.result)
let wb: XLSX.WorkBook | undefined = undefined
const xlsxOptions: XLSX.ParsingOptions = {
type: 'binary',
cellDates: false,
cellFormula: true,
cellStyles: true,
cellNF: false,
cellText: false
}
try {
wb = XLSX.read(bstr, {
...xlsxOptions
})
} catch (err: any) {
this.eventService.showAbortModal(
null,
err,
undefined,
'Error reading file'
)
}
if (!wb) {
this.isLoading = false
this.isLoadingDesc = ''
this.status = Status.ReadyToUpload
this.uploader.queue.pop()
return
}
this.extractData(wb)
return
}
reader.readAsArrayBuffer(file)
} else {
this.isLoading = false
this.isLoadingDesc = ''
this.status = Status.ReadyToUpload
this.showUploadModal = true
this.uploader.queue.pop()
const abortMsg =
'Invalid file type "<b>' +
this.filename +
'</b>". Please upload excel file.'
this.eventService.showAbortModal(null, abortMsg)
}
}
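The extension check in `getFileDesc` compares the three characters after the last dot against `'xls'`, which is why `.xls`, `.xlsx` and `.xlsm` all pass while other types are rejected. A sketch of just that slice (the `fileType` helper name is illustrative; the component inlines this expression):

```typescript
// Illustrative helper: extract the three characters after the last dot,
// mirroring the slice in getFileDesc().
const fileType = (filename: string): string =>
  filename.slice(
    filename.lastIndexOf('.') + 1,
    filename.lastIndexOf('.') + 4
  )

const a = fileType('report.xlsx')
// a → 'xls' (passes the excel check)
const b = fileType('data.csv')
// b → 'csv' (rejected)
```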
public discardExtractedData() {
this.isLoading = false
this.isLoadingDesc = ''
this.status = Status.ReadyToUpload
this.xlData = []
this.filename = ''
this.uploader.queue = []
if (this.fileUploadInputCompList.first) {
this.fileUploadInputCompList.first.nativeElement.value = ''
}
}
/**
* Submits attached excel file that is in preview mode
*/
public submitExcel() {
if (this.licenceState.value.submit_rows_limit !== Infinity) {
this.submitLimitNotice = true
return
}
this.submit()
}
public submit() {
if (!this.selectedXLMap || !this.xlData.length) return
this.status = Status.Submitting
this.isLoading = true
this.isLoadingDesc = 'Submitting extracted data'
const filesToUpload: UploadFile[] = []
for (const file of this.uploader.queue) {
filesToUpload.push({
file: file,
fileName: file.name
})
}
const csvContent =
Object.keys(this.xlData[0]).join(',') +
'\n' +
this.xlData
.slice(0, this.licenceState.value.submit_rows_limit)
.map((row: any) => Object.values(row).join(','))
.join('\n')
const blob = new Blob([csvContent], { type: 'application/csv' })
const file: File = blobToFile(blob, this.filename + '.csv')
filesToUpload.push({
file: file,
fileName: file.name
})
const uploadUrl = 'services/editors/loadfile'
this.sasService
.uploadFile(uploadUrl, filesToUpload, {
table: this.selectedXLMap.targetDS
})
.then((res: any) => {
if (res.sasjsAbort) {
const abortRes = res
const abortMsg = abortRes.sasjsAbort[0].MSG
const macMsg = abortRes.sasjsAbort[0].MAC
this.eventService.showAbortModal('', abortMsg, {
SYSWARNINGTEXT: abortRes.SYSWARNINGTEXT,
SYSERRORTEXT: abortRes.SYSERRORTEXT,
MAC: macMsg
})
} else if (res.sasparams) {
const params = res.sasparams[0]
const tableId = params.DSID
this.router.navigateByUrl('/stage/' + tableId)
}
})
.catch((err: any) => {
this.eventService.catchResponseError('file upload', err)
})
.finally(() => {
this.status = Status.ReadyToSubmit
this.isLoading = false
this.isLoadingDesc = ''
})
}
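The CSV serialisation in `submit()` is a header row built from the keys of the first extracted entry, followed by one comma-joined line per row. A standalone sketch (the sample rows are illustrative, not taken from a real upload):

```typescript
// Sketch of the CSV construction from submit(): header from the first
// entry's keys, then comma-joined values, all newline-separated.
const xlData = [
  { LOAD_REF: '0', XLMAP_ID: 'MAP01', ROW_NO: 1, COL_NO: 1, VALUE_TXT: '42' },
  { LOAD_REF: '0', XLMAP_ID: 'MAP01', ROW_NO: 1, COL_NO: 2, VALUE_TXT: 'cash' }
]
const csvContent =
  Object.keys(xlData[0]).join(',') +
  '\n' +
  xlData.map((row) => Object.values(row).join(',')).join('\n')
// csvContent:
// LOAD_REF,XLMAP_ID,ROW_NO,COL_NO,VALUE_TXT
// 0,MAP01,1,1,42
// 0,MAP01,1,2,cash
```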
public extractData(wb: XLSX.WorkBook) {
const extractedData: XLUploadEntry[] = []
this.xlmapRules.forEach((rule) => {
let sheetName = rule.XLMAP_SHEET
// if sheet name is not an absolute name rather an index string like /1, /2, etc
// we extract the index and find absolute sheet name for specified index
if (sheetName.startsWith('/')) {
const temp = sheetName.split('/')[1]
const sheetIndex = parseInt(temp) - 1
sheetName = wb.SheetNames[sheetIndex]
}
const sheet = wb.Sheets[sheetName]
const arrayOfObjects = <any[]>XLSX.utils.sheet_to_json(sheet, {
raw: true,
header: 'A',
blankrows: true
})
const start = getCellAddress(rule.XLMAP_START, arrayOfObjects)
const finish = getFinishingCell(start, rule.XLMAP_FINISH, arrayOfObjects)
const range = `${start}:${finish}`
const rangedData = <any[]>XLSX.utils.sheet_to_json(sheet, {
raw: true,
range: range,
header: 'A',
blankrows: true
})
for (let i = 0; i < rangedData.length; i++) {
const row = rangedData[i]
// Get the keys of the object (excluding '__rowNum__')
const keys = Object.keys(row).filter((key) => key !== '__rowNum__')
for (let j = 0; j < keys.length; j++) {
const key = keys[j]
const val = row[key]
// in excel's R1C1 notation indexing starts from 1 but in JS it starts from 0
// therefore, we'll have to add 1 to rows and cols
extractedData.push({
LOAD_REF: '0',
XLMAP_ID: rule.XLMAP_ID,
XLMAP_RANGE_ID: rule.XLMAP_RANGE_ID,
ROW_NO: i + 1,
COL_NO: j + 1,
VALUE_TXT: val
})
}
}
})
this.status = Status.ReadyToSubmit
this.isLoading = false
this.isLoadingDesc = ''
this.xlData = extractedData
this.selectedTab = Tabs.Data
}
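The sheet-name resolution at the top of `extractData` treats a rule sheet like `/2` as "the second sheet in the workbook". A sketch of that logic in isolation (`resolveSheetName` is a hypothetical helper name; the component inlines this):

```typescript
// Illustrative helper mirroring the sheet-index resolution in extractData():
// '/N' resolves to the Nth sheet name (1-based), anything else is literal.
const resolveSheetName = (ruleSheet: string, sheetNames: string[]): string => {
  if (ruleSheet.startsWith('/')) {
    const sheetIndex = parseInt(ruleSheet.split('/')[1], 10) - 1
    return sheetNames[sheetIndex]
  }
  return ruleSheet
}

const name = resolveSheetName('/2', ['Summary', 'Detail'])
// name → 'Detail'
```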
async viewXLMapRules() {
if (!this.selectedXLMap) return
this.isLoading = true
this.isLoadingDesc = 'Loading excel rules'
this.status = Status.FetchingRules
await this.sasStoreService
.getXLMapRules(this.selectedXLMap.id)
.then((res) => {
this.xlmapRules = res.xlmaprules
this.status = Status.ReadyToUpload
})
.catch((err) => {
this.loggerService.error(err)
})
this.isLoading = false
this.isLoadingDesc = ''
}
private load() {
this.xlmaps = globals.xlmaps
this.xlmapsLoading = false
const id = this.route.snapshot.params['id']
if (id) {
const xlmapListItem = this.xlmaps.find((item) => item.id === id)
if (xlmapListItem) {
this.selectedXLMap = xlmapListItem
this.viewXLMapRules()
}
}
}
ngOnInit() {
this.licenceService.hot_license_key.subscribe(
(hot_license_key: string | undefined) => {
this.hotTableLicenseKey = hot_license_key
}
)
}
ngAfterViewInit() {
return
}
ngAfterContentInit(): void {
if (globals.editor.startupSet) {
this.load()
} else {
this.eventService.onStartupDataLoaded.subscribe(() => {
this.load()
})
}
}
}

View File

@@ -0,0 +1,31 @@
import { CommonModule } from '@angular/common'
import { NgModule } from '@angular/core'
import { FormsModule } from '@angular/forms'
import { ClarityModule } from '@clr/angular'
import { HotTableModule } from '@handsontable/angular'
import { registerAllModules } from 'handsontable/registry'
import { AppSharedModule } from '../app-shared.module'
import { DirectivesModule } from '../directives/directives.module'
import { XLMapRouteComponent } from '../routes/xlmap-route/xlmap-route.component'
import { DcTreeModule } from '../shared/dc-tree/dc-tree.module'
import { XLMapRoutingModule } from './xlmap-routing.module'
import { XLMapComponent } from './xlmap.component'
// register Handsontable's modules
registerAllModules()
@NgModule({
declarations: [XLMapRouteComponent, XLMapComponent],
imports: [
HotTableModule,
XLMapRoutingModule,
FormsModule,
ClarityModule,
AppSharedModule,
CommonModule,
DcTreeModule,
DirectivesModule
],
exports: [XLMapComponent]
})
export class XLMapModule {}

View File

@ -18,4 +18,4 @@ In any case, you must not make any such use of this software as to develop softw
UNLESS EXPRESSLY AGREED OTHERWISE, 4GL APPS PROVIDES THIS SOFTWARE ON AN "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, AND IN NO EVENT AND UNDER NO LEGAL THEORY, SHALL 4GL APPS BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES OF ANY CHARACTER ARISING FROM USE OR INABILITY TO USE THIS SOFTWARE.


@ -288,8 +288,8 @@ function resolveDateString(data, ca, component, width, key) {
resolved = hop.call(obj, width)
  ? obj[width]
  : hop.call(obj, alts[width][0])
  ? obj[alts[width][0]]
  : obj[alts[width][1]]
// `key` wouldn't be specified for components 'dayPeriods'
return key != null ? resolved[key] : resolved


@ -1,16 +1,17 @@
/* You can add global styles to this file, and also import other style files */
@import '~handsontable/dist/handsontable.full.css';
@import '~@clr/ui/clr-ui.min.css';
@import '~@clr/icons/clr-icons.min.css';
@font-face {
  font-family: text-security-disc;
  src: url('https://raw.githubusercontent.com/noppa/text-security/master/dist/text-security-disc.woff');
}
body,
html {
  font-weight: 400 !important;
  padding: 0;
  margin: 0;
@ -29,7 +30,7 @@ button {
}
// Custom loading spinner
.slider {
  position: absolute;
  width: 320px;
  margin-left: 75px;
@ -38,33 +39,45 @@ button {
  overflow-x: hidden;
}
.line {
  position: absolute;
  opacity: 0.4;
  background: #73d544;
  width: 150%;
  height: 5px;
}
.subline {
  position: absolute;
  background: #73d544;
  height: 5px;
}
.inc {
  animation: increase 2s infinite;
}
.dec {
  animation: decrease 2s 0.5s infinite;
}
@keyframes increase {
  from {
    left: -5%;
    width: 5%;
  }
  to {
    left: 130%;
    width: 100%;
  }
}
@keyframes decrease {
  from {
    left: -80%;
    width: 80%;
  }
  to {
    left: 110%;
    width: 10%;
  }
}
// Custo loading spinner end
@ -276,6 +289,10 @@ button {
  margin-bottom: 10px;
}
.mb-10-i {
  margin-bottom: 10px !important;
}
.mb-20 {
  margin-bottom: 20px;
}
@ -321,11 +338,11 @@ button {
}
.color-dark-gray {
  color: #495967;
}
.color-darker-gray {
  color: #314351;
}
.color-white {
@ -333,7 +350,7 @@ button {
}
.color-white-i {
  color: white !important;
}
.color-green {
@ -341,15 +358,15 @@ button {
}
.color-dc-green {
  color: #81b440;
}
.color-red {
  color: #e45454;
}
.color-orange {
  color: #e67e22;
}
.color-blue {
@ -357,7 +374,7 @@ button {
}
.color-yellow {
  color: #f1c40f;
}
.cursor-pointer {
@ -501,7 +518,7 @@ button {
}
.z-index-highest {
  z-index: 10000000;
}
.vertical-align-middle {
@ -519,35 +536,36 @@ button {
}
.progresStatic {
  margin-top: -6px !important;
  position: absolute !important;
  z-index: 10000 !important;
}
.progress,
.progress-static {
  background-color: #f5f6fe;
  border-radius: 0;
  font-size: inherit;
  height: 6px;
  margin: 0;
  max-height: 0.583333rem;
  min-height: 0.166667rem;
  overflow: hidden;
  display: block;
  width: calc(100% - 63px);
}
.progress.loop:after {
  -webkit-animation: clr-progress-looper 1.5s ease-in-out infinite;
  animation: clr-progress-looper 1.5s ease-in-out infinite;
  content: ' ';
  top: 0.166667rem;
  bottom: 0;
  left: 0;
  position: absolute;
  display: block;
  background-color: #60b515;
  width: 75%;
}
// Fix for clarity bug, should be addressed when clarity is updated
@ -570,9 +588,9 @@ button {
}
.alert-app-level.alert-danger {
  background: #d94b2e;
  color: #fff;
  border: none;
}
.card-header {
@ -581,7 +599,7 @@ button {
.select select:focus {
  border-bottom: 1px solid #495967;
  background: linear-gradient(180deg, transparent 95%, #495a67 0) no-repeat;
}
.clr-treenode-children {
@ -597,7 +615,9 @@ button {
  background: #d8e3e9;
}
clr-select-container .clr-control-container,
clr-select-container .clr-control-container .clr-select-wrapper,
clr-select-container select {
  width: 100%;
}
@ -605,42 +625,46 @@ tbody {
  font-weight: 400;
}
h3,
h4 {
  color: #585858;
  font-weight: 400;
  letter-spacing: normal;
  line-height: 1rem;
  margin-top: 1rem;
  margin-bottom: 0;
  /* text-transform: uppercase; */
}
h1,
h2 {
  color: #585858;
  font-weight: 400;
  /* font-family: Metropolis,Avenir Next,Helvetica Neue,Arial,sans-serif; */
  letter-spacing: normal;
  line-height: 2rem;
  margin-top: 1rem;
  margin-bottom: 0;
  /* text-transform: uppercase; */
}
clr-icon.is-info {
  fill: #80b441;
}
.datagrid-host,
.datagrid-overlay-wrapper {
  display: -webkit-box;
  display: -ms-flexbox;
  display: -webkit-box !important;
  -webkit-box-direction: normal;
}
.btn.btn-danger,
.btn.btn-warning {
  border-color: #ef4f2e;
  background-color: #d94b2e;
  color: #fff;
}
.d-none {
@ -685,11 +709,11 @@ clr-icon.is-info {
}
.handsontable td.htInvalid {
  background: #e62700ad !important;
  border: 1px solid red !important;
  color: #ffffff !important;
}
.margin-top-20 {
  margin-top: 20px;
}
.hidden {
@ -823,7 +847,7 @@ clr-icon.is-info {
}
.datagrid-body {
  padding-bottom: 2rem !important;
}
.abortMsg {
@ -831,16 +855,15 @@ clr-icon.is-info {
  font-family: monospace;
}
#graph svg {
  height: 100%;
  width: 100%;
}
.no-table-selected {
  display: flex;
  justify-content: center;
  flex-direction: column;
  align-items: center;
  position: absolute;
  background: white;
@ -851,16 +874,15 @@ clr-icon.is-info {
}
.copyRight {
  background: #495967 !important;
  color: #fff;
  display: flex !important;
  justify-content: center;
  align-items: center;
  padding: 5px 0px 4px 0px;
  z-index: 100;
}
.nav-tree > clr-tree-node.clr-expanded {
  display: inline-block !important;
}
@ -903,13 +925,13 @@ clr-tree-node {
}
.tree-search-wrapper {
  position: relative;
  display: flex;
  align-items: center;
  clr-input-container {
    margin: 0;
  }
  clr-icon {
    position: absolute;
@ -956,7 +978,8 @@ input::-ms-clear {
  overflow: hidden !important;
}
.clr-treenode-content .clr-icon,
.clr-treenode-content clr-icon {
  min-width: 16px;
  min-height: 16px;
}
@ -985,12 +1008,12 @@ input::-ms-clear {
}
.loadingSpinner {
  height: 70vh;
  flex: 1;
  display: flex;
  justify-content: center;
  flex-direction: column;
  align-items: center;
}
.disable-password-manager {
@ -1025,7 +1048,8 @@ hr.light {
  position: relative;
  min-width: 170px;
  clr-icon,
  .spinner {
    position: absolute;
    right: 19px;
    top: 0px;
@ -1063,7 +1087,7 @@ hr.light {
}
/* Firefox */
input[type='number'] {
  -moz-appearance: textfield;
}
}
@ -1076,4 +1100,4 @@ hr.light {
.link-it {
  cursor: pointer;
  text-decoration: underline;
}

package-lock.json (generated, 6480 lines changed) — diff suppressed because it is too large


@ -1,15 +1,15 @@
 {
   "name": "dcfrontend",
-  "version": "6.2.0",
+  "version": "6.4.0",
   "description": "Data Controller",
   "devDependencies": {
     "@saithodev/semantic-release-gitea": "^2.1.0",
     "@semantic-release/changelog": "^6.0.3",
     "@semantic-release/commit-analyzer": "^10.0.1",
+    "@semantic-release/npm": "11.0.0",
     "@semantic-release/git": "^10.0.1",
     "@semantic-release/release-notes-generator": "^11.0.4",
-    "commit-and-tag-version": "^11.2.2",
-    "prettier": "3.0.0"
+    "commit-and-tag-version": "^11.2.2"
   },
   "scripts": {
     "install": "cd client && npm i && cd ../sas && npm i",
@ -23,5 +23,10 @@
   "repository": {
     "type": "git",
     "url": "https://git.datacontroller.io/dc/dc.git"
-  }
+  },
+  "private": true,
+  "//": [
+    "Readme",
+    "We must set private: true so that semantic-release/npm plugin will update the package.json version but not try to release it as NPM package"
+  ]
 }


@ -83,6 +83,12 @@ _webout = `{"SYSDATE" : "26SEP22"
"DC_RESTRICT_EDITRECORD": "NO" "DC_RESTRICT_EDITRECORD": "NO"
} }
] ]
,"xlmaps":
[
["BASEL-CR2" ,"" ,"DC695588.MPE_XLMAP_DATA" ]
,["BASEL-KM1" ,"Basel 3 Key Metrics report" ,"DC695588.MPE_XLMAP_DATA" ]
,["SAMPLE" ,"" ,"DC695588.MPE_XLMAP_DATA" ]
]
,"_DEBUG" : "" ,"_DEBUG" : ""
,"_METAUSER": "sasdemo@SAS" ,"_METAUSER": "sasdemo@SAS"
,"_METAPERSON": "sasdemo" ,"_METAPERSON": "sasdemo"

sas/package-lock.json (generated, 26 lines changed)

@ -7,7 +7,7 @@
"name": "dc-sas", "name": "dc-sas",
"dependencies": { "dependencies": {
"@sasjs/cli": "^4.11.1", "@sasjs/cli": "^4.11.1",
"@sasjs/core": "^4.47.0" "@sasjs/core": "^4.49.0"
} }
}, },
"node_modules/@coolaj86/urequest": { "node_modules/@coolaj86/urequest": {
@ -116,9 +116,9 @@
"integrity": "sha512-Grwydm5GxBsYk238PZw41XPjXVVQ9vWcvfZ06L2P0bQbvK0sGn7l69JA7H5MGr3QcaLpiD4Kg70cAh7PgE+JOw==" "integrity": "sha512-Grwydm5GxBsYk238PZw41XPjXVVQ9vWcvfZ06L2P0bQbvK0sGn7l69JA7H5MGr3QcaLpiD4Kg70cAh7PgE+JOw=="
}, },
"node_modules/@sasjs/core": { "node_modules/@sasjs/core": {
"version": "4.47.0", "version": "4.49.0",
"resolved": "https://registry.npmjs.org/@sasjs/core/-/core-4.47.0.tgz", "resolved": "https://registry.npmjs.org/@sasjs/core/-/core-4.49.0.tgz",
"integrity": "sha512-ysSii9kTZuUsCfjaCu3coKtdnBmFF03EoUnddmAUKLbrD7y5/m3fsiDrpTYoAJ75+xuzAx+ssj0xGG3gVqCm7w==" "integrity": "sha512-hp3Hb4DkT6FmowyNHTOvSlgmSObW9WeuTJj+TQlwPgnBo59mAB4XFUnUaYSA+7ghvsHqUZf1OP2eSYqmnN5swQ=="
}, },
"node_modules/@sasjs/lint": { "node_modules/@sasjs/lint": {
"version": "2.3.1", "version": "2.3.1",
@ -230,9 +230,9 @@
} }
}, },
"node_modules/@types/tough-cookie": { "node_modules/@types/tough-cookie": {
"version": "4.0.3", "version": "4.0.5",
"resolved": "https://registry.npmjs.org/@types/tough-cookie/-/tough-cookie-4.0.3.tgz", "resolved": "https://registry.npmjs.org/@types/tough-cookie/-/tough-cookie-4.0.5.tgz",
"integrity": "sha512-THo502dA5PzG/sfQH+42Lw3fvmYkceefOspdCwpHRul8ik2Jv1K8I5OZz1AT3/rs46kwgMCe9bSBmDLYkkOMGg==", "integrity": "sha512-/Ad8+nIOV7Rl++6f1BdKxFSMgmoqEoYbHRpPcx3JEfv8VRsQe9Z4mCXeJBzxs7mbHY/XOZZuXlRNfhpVPbs6ZA==",
"peer": true "peer": true
}, },
"node_modules/abab": { "node_modules/abab": {
@ -1834,9 +1834,9 @@
} }
}, },
"@sasjs/core": { "@sasjs/core": {
"version": "4.47.0", "version": "4.49.0",
"resolved": "https://registry.npmjs.org/@sasjs/core/-/core-4.47.0.tgz", "resolved": "https://registry.npmjs.org/@sasjs/core/-/core-4.49.0.tgz",
"integrity": "sha512-ysSii9kTZuUsCfjaCu3coKtdnBmFF03EoUnddmAUKLbrD7y5/m3fsiDrpTYoAJ75+xuzAx+ssj0xGG3gVqCm7w==" "integrity": "sha512-hp3Hb4DkT6FmowyNHTOvSlgmSObW9WeuTJj+TQlwPgnBo59mAB4XFUnUaYSA+7ghvsHqUZf1OP2eSYqmnN5swQ=="
}, },
"@sasjs/lint": { "@sasjs/lint": {
"version": "2.3.1", "version": "2.3.1",
@ -1934,9 +1934,9 @@
} }
}, },
"@types/tough-cookie": { "@types/tough-cookie": {
"version": "4.0.3", "version": "4.0.5",
"resolved": "https://registry.npmjs.org/@types/tough-cookie/-/tough-cookie-4.0.3.tgz", "resolved": "https://registry.npmjs.org/@types/tough-cookie/-/tough-cookie-4.0.5.tgz",
"integrity": "sha512-THo502dA5PzG/sfQH+42Lw3fvmYkceefOspdCwpHRul8ik2Jv1K8I5OZz1AT3/rs46kwgMCe9bSBmDLYkkOMGg==", "integrity": "sha512-/Ad8+nIOV7Rl++6f1BdKxFSMgmoqEoYbHRpPcx3JEfv8VRsQe9Z4mCXeJBzxs7mbHY/XOZZuXlRNfhpVPbs6ZA==",
"peer": true "peer": true
}, },
"abab": { "abab": {


@ -14,7 +14,8 @@
"sas9e": "sasjs request services/admin/makedata -d deploy/makeDataSas9.json -t sas9 ", "sas9e": "sasjs request services/admin/makedata -d deploy/makeDataSas9.json -t sas9 ",
"sas9f": "sasjs request services/admin/refreshtablelineage -t sas9 ", "sas9f": "sasjs request services/admin/refreshtablelineage -t sas9 ",
"sas9g": "sasjs request services/admin/refreshcatalog -t sas9", "sas9g": "sasjs request services/admin/refreshcatalog -t sas9",
"4gl": "npm run cpfavicon && sasjs cbd -t 4gl && sasjs request services/admin/makedata -d deploy/makeData4GL.json -l sasjsresults/makedata_4gl.log -o sasjsresults/makedata_4gl.json -t 4gl", "4gl": "npm run cpfavicon && sasjs cbd -t 4gl && npm run 4glmakedata",
"4glmakedata": "sasjs request services/admin/makedata -d deploy/makeData4GL.json -l sasjsresults/makedata_4gl.log -o sasjsresults/makedata_4gl.json -t 4gl",
"server": "npm run cpfavicon && sasjs cbd -t server && npm run serverdata", "server": "npm run cpfavicon && sasjs cbd -t server && npm run serverdata",
"server-mihajlo": "npm run cpfavicon && sasjs cbd -t server-mihajlo && npm run serverdata-mihajlo", "server-mihajlo": "npm run cpfavicon && sasjs cbd -t server-mihajlo && npm run serverdata-mihajlo",
"serverdata-mihajlo": "sasjs request services/admin/makedata -d deploy/makeDataServer.json -l sasjsresults/makedata_server.log -o sasjsresults/makedata_server.json -t server-mihajlo", "serverdata-mihajlo": "sasjs request services/admin/makedata -d deploy/makeDataServer.json -l sasjsresults/makedata_server.log -o sasjsresults/makedata_server.json -t server-mihajlo",
@ -28,6 +29,6 @@
"private": true, "private": true,
"dependencies": { "dependencies": {
"@sasjs/cli": "^4.11.1", "@sasjs/cli": "^4.11.1",
"@sasjs/core": "^4.47.0" "@sasjs/core": "^4.49.0"
} }
} }


@ -0,0 +1,18 @@
/**
@file
@brief DDL for MPE_XLMAP_DATA
@version 9.3
@author 4GL Apps Ltd
@copyright 4GL Apps Ltd
**/
create table &curlib..MPE_XLMAP_DATA(
LOAD_REF char(32) not null,
XLMAP_ID char(32) not null,
XLMAP_RANGE_ID char(32) not null,
ROW_NO num not null,
COL_NO num not null,
VALUE_TXT char(4000),
constraint pk_MPE_XLMAP_DATA
primary key(LOAD_REF, XLMAP_ID, XLMAP_RANGE_ID, ROW_NO, COL_NO));


@ -0,0 +1,17 @@
/**
@file
@brief DDL for mpe_xlmap_info
@version 9.3
@author 4GL Apps Ltd
@copyright 4GL Apps Ltd
**/
create table &curlib..mpe_xlmap_info(
tx_from num not null,
XLMAP_ID char(32) not null,
XLMAP_DESCRIPTION char(1000) not null,
XLMAP_TARGETLIBDS char(41) not null,
tx_to num not null,
constraint pk_mpe_xlmap_info
primary key(tx_from,XLMAP_ID));


@ -0,0 +1,19 @@
/**
@file
@brief DDL for mpe_xlmap_rules
@version 9.3
@author 4GL Apps Ltd
@copyright 4GL Apps Ltd
**/
create table &curlib..mpe_xlmap_rules(
tx_from num not null,
XLMAP_ID char(32) not null,
XLMAP_RANGE_ID char(32) not null,
XLMAP_SHEET char(32) not null,
XLMAP_START char(1000) not null,
XLMAP_FINISH char(1000),
tx_to num not null,
constraint pk_mpe_xlmap_rules
primary key(tx_from,XLMAP_ID,XLMAP_RANGE_ID));


@ -0,0 +1,44 @@
/**
@file
@brief migration script to move from v5 to v6.5 of data controller
**/
%let dclib=YOURDCLIB;
libname &dclib "/your/dc/path";
/**
* Change 1
* New MPE_SUBMIT table
*/
proc sql;
create table &dclib..mpe_xlmap_rules(
tx_from num not null,
XLMAP_ID char(32) not null,
XLMAP_RANGE_ID char(32) not null,
XLMAP_SHEET char(32) not null,
XLMAP_START char(1000) not null,
XLMAP_FINISH char(1000),
tx_to num not null,
constraint pk_mpe_xlmap_rules
primary key(tx_from,XLMAP_ID,XLMAP_RANGE_ID));
create table &dclib..MPE_XLMAP_DATA(
LOAD_REF char(32) not null,
XLMAP_ID char(32) not null,
XLMAP_RANGE_ID char(32) not null,
ROW_NO num not null,
COL_NO num not null,
VALUE_TXT char(4000),
constraint pk_MPE_XLMAP_DATA
primary key(LOAD_REF, XLMAP_ID, XLMAP_RANGE_ID, ROW_NO, COL_NO));
create table &dclib..mpe_xlmap_info(
tx_from num not null,
XLMAP_ID char(32) not null,
XLMAP_DESCRIPTION char(1000) not null,
XLMAP_TARGETLIBDS char(41) not null,
tx_to num not null,
constraint pk_mpe_xlmap_info
primary key(tx_from,XLMAP_ID));


@ -2,7 +2,7 @@
 This site contains the SAS code used in Data Controller for SAS. The pages were generated using [`sasjs doc`](https://cli.sasjs.io/doc).
-You can download Data Controller from [here](https://4gl.uk/dcdeploy).
+You can download Data Controller from [here](https://git.datacontroller.io/dc/dc/releases).
 The main website is [https://datacontroller.io](https://datacontroller.io) and the user guide is [here](https://docs.datacontroller.io).


@ -131,7 +131,7 @@ filename __out email ("&emails")
 txt=symget('SUBMITTED_TXT');
 put "Reason provided: " txt;
 put " ";
-put "This is an automated email by Data Controller for SAS®. For "
+put "This is an automated email by Data Controller for SAS. For "
   "documentation, please visit https://docs.datacontroller.io";
 run;
 %end;
@ -144,7 +144,7 @@ filename __out email ("&emails")
 put "Please be advised that a change to table &alert_lib..&alert_ds has "
   "been approved by &from_user on the '&syshostname' SAS server.";
 put " ";
-put "This is an automated email by Data Controller for SAS®. For "
+put "This is an automated email by Data Controller for SAS. For "
   "documentation, please visit https://docs.datacontroller.io";
 run;
 %end;
@ -165,7 +165,7 @@ filename __out email ("&emails")
 txt=symget('REVIEW_REASON_TXT');
 put "Reason provided: " txt;
 put " ";
-put "This is an automated email by Data Controller for SAS®. For "
+put "This is an automated email by Data Controller for SAS. For "
   "documentation, please visit https://docs.datacontroller.io";
 run;
 %end;


@ -24,6 +24,7 @@
 @li mp_lockanytable.sas
 @li mpe_accesscheck.sas
 @li mpe_alerts.sas
+@li mpe_xlmapvalidate.sas
 @li mpe_loadfail.sas
 @li mpe_runhook.sas
@ -450,7 +451,7 @@ run;
 %do i=1 %to %sysfunc(countw(&pk));
   %let iWord=%scan(&pk,&i);
   call symputx('duplist',symget('duplist')!!
-    " &iWord="!!trim(&iWord));
+    " &iWord="!!cats(&iWord));
 %end;
 run;
 %let msg=This upload contains duplicates on the Primary Key columns %trim(
@ -472,6 +473,10 @@ run;
 %return;
 %end;
+/* If a Complex Excel Upload, needs to have the load ref added to the table */
+%mpe_xlmapvalidate(&mperef,work.staging_ds,&mpelib,&orig_libds)
+
+/* Run the Post Edit Hook prior to creation of staging folder */
 %mpe_runhook(POST_EDIT_HOOK)
 /* stop if err */


@ -269,6 +269,210 @@ insert into &lib..mpe_datadictionary set
,DD_SENSITIVITY="Low"
,tx_to='31DEC5999:23:59:59'dt;
/**
* mpe_xlmap_info
*/
insert into &lib..mpe_xlmap_info set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_description='Basel 3 Key Metrics report'
,XLMAP_TARGETLIBDS="&lib..MPE_XLMAP_DATA";
/**
* mpe_xlmap_rules
*/
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:a'
,xlmap_sheet='KM1'
,xlmap_start='MATCH 4 R[2]C[0]:a';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:b'
,xlmap_sheet='KM1'
,xlmap_start='MATCH 4 R[2]C[0]:b';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:c'
,xlmap_sheet='KM1'
,xlmap_start='MATCH 4 R[2]C[0]:c';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:d'
,xlmap_sheet='KM1'
,xlmap_start='MATCH 4 R[2]C[0]:d';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:e'
,xlmap_sheet='KM1'
,xlmap_start='MATCH 4 R[2]C[0]:e';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:f'
,xlmap_sheet='KM1'
,xlmap_start='MATCH 4 R[2]C[0]:f';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1/a'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[0]C[1]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1/b'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[0]C[2]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1/c'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[0]C[3]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1/d'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[0]C[4]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1/e'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[0]C[5]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1/f'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[0]C[6]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1a/e'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[1]C[5]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:1a/f'
,xlmap_sheet='KM1'
,xlmap_start='MATCH C R[1]C[6]:Common Equity Tier 1 (CET1)';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2/a'
,xlmap_sheet='KM1'
,xlmap_start='ABSOLUTE D10';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2/b'
,xlmap_sheet='/3'
,xlmap_start='ABSOLUTE E10';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2/c'
,xlmap_sheet='/3'
,xlmap_start='RELATIVE R[10]C[6]';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2/d'
,xlmap_sheet='/3'
,xlmap_start='RELATIVE R[10]C[8]';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2/e'
,xlmap_sheet='/3'
,xlmap_start='RELATIVE R[10]C[9]';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2/f'
,xlmap_sheet='/3'
,xlmap_start='RELATIVE R[10]C[10]';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:2a'
,xlmap_sheet='KM1'
,xlmap_start='ABSOLUTE H11'
,xlmap_finish='RELATIVE R[0]C[1]';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-KM1'
,xlmap_range_id='KM1:3'
,xlmap_sheet='KM1'
,xlmap_start='RELATIVE R[12]C[4]'
,xlmap_finish='ABSOLUTE I13';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-CR2'
,xlmap_range_id='CR2-sec1'
,xlmap_sheet='CR2'
,xlmap_start='ABSOLUTE D8'
,xlmap_finish='BLANKROW';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='BASEL-CR2'
,xlmap_range_id='CR2-sec2'
,xlmap_sheet='CR2'
,xlmap_start='ABSOLUTE D18'
,xlmap_finish='LASTDOWN';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='SAMPLE'
,xlmap_range_id='header'
,xlmap_sheet='/1'
,xlmap_start='ABSOLUTE B3'
,xlmap_finish='ABSOLUTE B8';
insert into &lib..mpe_xlmap_rules set
tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,xlmap_id='SAMPLE'
,xlmap_range_id='data'
,xlmap_sheet='/1'
,xlmap_start='ABSOLUTE B13'
,xlmap_finish='ABSOLUTE E16';
/**
* MPE_GROUPS
*/
@ -981,6 +1185,42 @@ insert into &lib..mpe_selectbox set
,notes='Docs: https://docs.datacontroller.io/column-level-security' ,notes='Docs: https://docs.datacontroller.io/column-level-security'
,post_edit_hook='services/hooks/mpe_column_level_security_postedit' ,post_edit_hook='services/hooks/mpe_column_level_security_postedit'
; ;
insert into &lib..mpe_tables
set tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,libref="&lib"
,dsn='MPE_XLMAP_INFO'
,num_of_approvals_required=1
,loadtype='TXTEMPORAL'
,var_txfrom='TX_FROM'
,var_txto='TX_TO'
,buskey='XLMAP_ID'
,notes='Docs: https://docs.datacontroller.io/complex-excel-uploads'
,post_edit_hook='services/hooks/mpe_xlmap_info_postedit'
;
insert into &lib..mpe_tables
set tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,libref="&lib"
,dsn='MPE_XLMAP_RULES'
,num_of_approvals_required=1
,loadtype='TXTEMPORAL'
,var_txfrom='TX_FROM'
,var_txto='TX_TO'
,buskey='XLMAP_ID XLMAP_RANGE_ID'
,notes='Docs: https://docs.datacontroller.io/complex-excel-uploads'
,post_edit_hook='services/hooks/mpe_xlmap_rules_postedit'
;
insert into &lib..mpe_tables
set tx_from=0
,tx_to='31DEC5999:23:59:59'dt
,libref="&lib"
,dsn='MPE_XLMAP_DATA'
,num_of_approvals_required=1
,loadtype='UPDATE'
,buskey='LOAD_REF XLMAP_ID XLMAP_RANGE_ID ROW_NO COL_NO'
,notes='Docs: https://docs.datacontroller.io/complex-excel-uploads'
;
insert into &lib..mpe_tables
set tx_from=0
,tx_to='31DEC5999:23:59:59'dt
@ -1253,6 +1493,27 @@ insert into &lib..MPE_VALIDATIONS set
,rule_value="services/validations/mpe_alerts.alert_lib"
,rule_active=1
,tx_to='31DEC5999:23:59:59'dt;
insert into &lib..MPE_VALIDATIONS set
tx_from=0
,base_lib="&lib"
,base_ds="MPE_XLMAP_INFO"
,base_col="XLMAP_ID"
,rule_type='CASE'
,rule_value='UPCASE'
,rule_active=1
,tx_to='31DEC5999:23:59:59'dt;
insert into &lib..MPE_VALIDATIONS set
tx_from=0
,base_lib="&lib"
,base_ds="MPE_XLMAP_RULES"
,base_col="XLMAP_ID"
,rule_type='CASE'
,rule_value='UPCASE'
,rule_active=1
,tx_to='31DEC5999:23:59:59'dt;
insert into &lib..MPE_VALIDATIONS set
tx_from=0
,base_lib="&lib"
@ -1640,6 +1901,16 @@ insert into &lib..MPE_VALIDATIONS set
,rule_value='1'
,rule_active=1
,tx_to='31DEC5999:23:59:59'dt;
insert into &lib..MPE_VALIDATIONS set
tx_from=0
,base_lib="&lib"
,base_ds="MPE_XLMAP_INFO"
,base_col="XLMAP_ID"
,rule_type='SOFTSELECT'
,rule_value="&lib..MPE_XLMAP_RULES.XLMAP_ID"
,rule_active=1
,tx_to='31DEC5999:23:59:59'dt;
/**
* MPE_X_TEST

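The sample `mpe_xlmap_rules` rows above use three addressing styles in `XLMAP_START`: `ABSOLUTE D8` (a fixed cell), `RELATIVE R[10]C[6]` (a row/column offset), and `MATCH C R[0]C[1]:text` (offset from a matched cell). As a sketch only, assuming this informal grammar and with names that are not the actual Data Controller code, the strings could be parsed like so:

```typescript
// Parsed form of an XLMAP_START / XLMAP_FINISH rule string (illustrative).
type XLMapRule =
  | { kind: 'ABSOLUTE'; cell: string }
  | { kind: 'RELATIVE'; rowOffset: number; colOffset: number }
  | { kind: 'MATCH'; axis: string; rowOffset: number; colOffset: number; text: string }

function parseXLMapStart(rule: string): XLMapRule {
  // ABSOLUTE <cell ref>, e.g. "ABSOLUTE D8"
  const abs = rule.match(/^ABSOLUTE ([A-Z]+\d+)$/)
  if (abs) return { kind: 'ABSOLUTE', cell: abs[1] }

  // RELATIVE R[r]C[c], e.g. "RELATIVE R[10]C[6]"
  const rel = rule.match(/^RELATIVE R\[(-?\d+)\]C\[(-?\d+)\]$/)
  if (rel) return { kind: 'RELATIVE', rowOffset: +rel[1], colOffset: +rel[2] }

  // MATCH <axis> R[r]C[c]:<search text>, e.g. "MATCH C R[0]C[1]:Common Equity Tier 1 (CET1)"
  const m = rule.match(/^MATCH (\S+) R\[(-?\d+)\]C\[(-?\d+)\]:(.*)$/)
  if (m) return { kind: 'MATCH', axis: m[1], rowOffset: +m[2], colOffset: +m[3], text: m[4] }

  throw new Error(`Unrecognised rule: ${rule}`)
}
```

The `XLMAP_FINISH` keywords seen above (`BLANKROW`, `LASTDOWN`) would extend the same union with keyword-only variants.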

@ -268,6 +268,55 @@ proc datasets lib=&lib noprint;
pk_mpe_excel_config=(tx_to xl_libref xl_table xl_column)
/nomiss unique;
quit;
proc sql;
create table &lib..MPE_XLMAP_DATA(
LOAD_REF char(32) &notnull,
XLMAP_ID char(32) &notnull,
XLMAP_RANGE_ID char(32) &notnull,
ROW_NO num &notnull,
COL_NO num &notnull,
VALUE_TXT char(4000)
);quit;
proc datasets lib=&lib noprint;
modify MPE_XLMAP_DATA;
index create
pk_MPE_XLMAP_DATA=(load_ref xlmap_id xlmap_range_id row_no col_no)
/nomiss unique;
quit;
proc sql;
create table &lib..mpe_xlmap_info(
tx_from num &notnull,
tx_to num &notnull,
XLMAP_ID char(32) &notnull,
XLMAP_DESCRIPTION char(1000) &notnull,
XLMAP_TARGETLIBDS char(41) &notnull
);quit;
proc datasets lib=&lib noprint;
modify mpe_xlmap_info;
index create
pk_mpe_xlmap_info=(tx_to xlmap_id)
/nomiss unique;
quit;
proc sql;
create table &lib..mpe_xlmap_rules(
tx_from num &notnull,
tx_to num &notnull,
XLMAP_ID char(32) &notnull,
XLMAP_RANGE_ID char(32) &notnull,
XLMAP_SHEET char(32) &notnull,
XLMAP_START char(1000) &notnull,
XLMAP_FINISH char(1000)
);quit;
proc datasets lib=&lib noprint;
modify mpe_xlmap_rules;
index create
pk_mpe_xlmap_rules=(tx_to xlmap_id xlmap_range_id)
/nomiss unique;
quit;
proc sql;
create table &lib..mpe_filteranytable(
filter_rk num &notnull,
View File
@@ -0,0 +1,28 @@
/**
@file
@brief Validates excel map structure and adds load ref
@details Used in staging, prior to the post edit hook
@version 9.2
@author 4GL Apps Ltd
@copyright 4GL Apps Ltd. This code may only be used within Data Controller
and may not be re-distributed or re-sold without the express permission of
4GL Apps Ltd.
**/
%macro mpe_xlmapvalidate(mperef,inds,dclib,tgtds);
%local ismap;
proc sql noprint;
select count(*) into: ismap
from &dclib..mpe_xlmap_info
where XLMAP_TARGETLIBDS="&tgtds" and &dc_dttmtfmt. le TX_TO ;
%if "&tgtds"="&dclib..MPE_XLMAP_DATA" or &ismap>0 %then %do;
data &inds;
set &inds;
LOAD_REF="&mperef";
run;
%end;
%mend mpe_xlmapvalidate;
View File
@@ -0,0 +1,59 @@
/**
@file
@brief Testing mpe_xlmapvalidate macro
<h4> SAS Macros </h4>
@li mpe_xlmapvalidate.sas
@li mp_assert.sas
@li mp_assertscope.sas
<h4> SAS Includes </h4>
@li mpe_xlmap_data.ddl ul
@author 4GL Apps Ltd
@copyright 4GL Apps Ltd. This code may only be used within Data Controller
and may not be re-distributed or re-sold without the express permission of
4GL Apps Ltd.
**/
/* create the table */
%let curlib=work;
proc sql;
%inc ul;
data work.test1;
if 0 then set work.MPE_XLMAP_DATA;
LOAD_REF='0';
XLMAP_ID='Sample';
XLMAP_RANGE_ID='Range 1';
ROW_NO=1;
COL_NO=2;
VALUE_TXT='something';
run;
%mp_assertscope(SNAPSHOT)
%mpe_xlmapvalidate(DCTEST1,work.test1,&dclib,NOT.MAP)
%mp_assertscope(COMPARE,
desc=Checking macro variables against previous snapshot
)
data _null_;
set work.test1;
call symputx('test1',load_ref);
run;
%mp_assert(
iftrue=(&test1=0),
desc=Checking load ref was not applied
)
%mpe_xlmapvalidate(DCTEST2,work.test1,&dclib,&dclib..MPE_XLMAP_DATA)
data _null_;
set work.test1;
call symputx('test2',load_ref);
run;
%mp_assert(
iftrue=(&test2=DCTEST2),
desc=Checking load ref was applied for default case
)
View File
@@ -207,7 +207,7 @@
},
{
"name": "4gl",
-"serverUrl": "https://sas9.4gl.io",
+"serverUrl": "https://sas.4gl.io",
"serverType": "SASJS",
"httpsAgentOptions": {
"allowInsecureRequests": false
View File
@@ -56,12 +56,12 @@
data _null_;
set work.sascontroltable;
call symputx('ACTION',ACTION);
-call symputx('TABLE',TABLE);
+call symputx('LOAD_REF',TABLE);
/* DIFFTIME is when the DIFF was generated on the frontend */
call symputx('DIFFTIME',DIFFTIME);
run;
-%global action is_err err_msg;
+%global action is_err err_msg msg;
%let is_err=0;
%let user=%mf_getuser();
@@ -80,7 +80,7 @@ RUN;
%let isfmtcat=0;
data APPROVE1;
set &mpelib..mpe_submit;
-where TABLE_ID="&TABLE";
+where TABLE_ID="&LOAD_REF";
/* fetch mpe_submit data */
libds=cats(base_lib,'.',base_ds);
REVIEWED_ON=put(reviewed_on_dttm,datetime19.);
@@ -115,9 +115,9 @@ run;
)
%mp_abort(
-iftrue=(%mf_verifymacvars(difftime orig_libds libds table)=0)
+iftrue=(%mf_verifymacvars(difftime orig_libds libds load_ref)=0)
,mac=&_program
-,msg=%str(Missing: difftime orig_libds libds table)
+,msg=%str(Missing: difftime orig_libds libds load_ref)
)
/* security checks */
@@ -186,7 +186,7 @@ run;
%let prev_upload_check=1;
proc sql;
select count(*) into: prev_upload_check from &mpelib..mpe_review
-where TABLE_ID="&TABLE" and REVIEWED_BY_NM="&user"
+where TABLE_ID="&LOAD_REF" and REVIEWED_BY_NM="&user"
and REVIEW_STATUS_ID ne "SUBMITTED";
%let authcheck=%mf_getattrn(work.authAPP,NLOBS);
%if &authcheck=0 or &prev_upload_check=1 %then %do;
@@ -233,7 +233,7 @@ run;
%else %let oldloc=%qsysfunc(getoption(LOG));
%if %length(&oldloc)>0 %then %do;
proc printto
-log="&mpelocapprovals/&TABLE/approval.log";
+log="&mpelocapprovals/&LOAD_REF/approval.log";
run;
data _null_;
if _n_=1 then do;
@@ -247,7 +247,7 @@ run;
%end;
%else %do;
proc printto
-log="&mpelocapprovals/&TABLE/approval.log";
+log="&mpelocapprovals/&LOAD_REF/approval.log";
run;
%end;
@@ -285,11 +285,11 @@ select PRE_APPROVE_HOOK, POST_APPROVE_HOOK, LOADTYPE, var_txfrom, var_txto
,msg=%str(Missing: mpelocapprovals orig_libds)
)
-/* get dataset from approvals location */
+/* get dataset from approvals location (has same name as load_ref) */
%let tmplib=%mf_getuniquelibref();
-libname &tmplib "&mpelocapprovals/&TABLE";
+libname &tmplib "&mpelocapprovals/&LOAD_REF";
data STAGING_DS;
-set &tmplib..&TABLE;
+set &tmplib..&LOAD_REF;
run;
%mp_abort(iftrue= (&syscc ne 0)
@@ -313,7 +313,7 @@ run;
%let apprno=%eval(&num_of_approvals_required-&num_of_approvals_remaining+1);
data work.append_review;
if 0 then set &mpelib..mpe_review;
-TABLE_ID="&TABLE";
+TABLE_ID="&LOAD_REF";
BASE_TABLE="&orig_libds";
REVIEW_STATUS_ID="APPROVED";
REVIEWED_BY_NM="&user";
@@ -323,7 +323,7 @@ run;
stop;
run;
%mp_lockanytable(LOCK,
-lib=&mpelib,ds=mpe_review,ref=%str(&table Approval),
+lib=&mpelib,ds=mpe_review,ref=%str(&LOAD_REF Approval),
ctl_ds=&mpelib..mpe_lockanytable
)
proc append base=&mpelib..mpe_review data=work.append_review;
@@ -335,7 +335,7 @@ run;
/* update mpe_submit table */
%mp_lockanytable(LOCK,
-lib=&mpelib,ds=mpe_submit,ref=%str(&table Approval),
+lib=&mpelib,ds=mpe_submit,ref=%str(&LOAD_REF Approval),
ctl_ds=&mpelib..mpe_lockanytable
)
proc sql;
@@ -343,7 +343,7 @@ run;
set num_of_approvals_remaining=&num_of_approvals_remaining-1,
reviewed_by_nm="&user",
reviewed_on_dttm=&sastime
-where table_id="&table";
+where table_id="&LOAD_REF";
%mp_lockanytable(UNLOCK,
lib=&mpelib,ds=mpe_submit,
ctl_ds=&mpelib..mpe_lockanytable
@@ -369,7 +369,7 @@ run;
)
%mpe_targetloader(libds=&orig_libds
,now= &sastime
-,etlsource=&TABLE
+,etlsource=&LOAD_REF
,STAGING_DS=STAGING_DS
,dclib=&mpelib
%if &action=APPROVE_TABLE %then %do;
@@ -405,7 +405,7 @@ run;
proc sql noprint;
select max(processed_dttm)-1 format=datetime19. into: tstamp
from &mpelib..mpe_dataloads
-where libref="&libref" and dsn="&ds" and ETLSOURCE="&TABLE";
+where libref="&libref" and dsn="&ds" and ETLSOURCE="&LOAD_REF";
quit;
%if &tstamp=. %then %let tstamp=%sysfunc(datetime(),datetime19.);
@@ -498,7 +498,7 @@ run;
else if _____orig then _____status='ORIGINAL';
run;
proc export data=TEMPDIFFS dbms=csv replace
-outfile="&mpelocapprovals/&TABLE/&tempDIFFS_CSV" ;
+outfile="&mpelocapprovals/&LOAD_REF/&tempDIFFS_CSV" ;
run;
proc sql noprint;
select filesize format=sizekmg10.1, filesize as filesize_raw
@@ -545,7 +545,7 @@ run;
proc sort data=&mpelib..mpe_submit(where=(
submit_status_cd='SUBMITTED'
and cats(base_lib,'.',base_ds)="&orig_libds"
-and table_id ne "&TABLE"
+and table_id ne "&LOAD_REF"
)) out=submits;
by descending submitted_on_dttm;
run;
@@ -599,7 +599,7 @@ run;
data work.outds_mod; run;
data work.outds_del; run;
%end;
-libname approve "&mpelocapprovals/&TABLE";
+libname approve "&mpelocapprovals/&LOAD_REF";
data; set &libds;stop;run;
%let emptybasetable=&syslast;
data approve.ActualDiffs;
@@ -621,7 +621,7 @@ run;
run;
proc export data=approve.ActualDiffs
-outfile="&mpelocapprovals/&TABLE/ActualDiffs.csv"
+outfile="&mpelocapprovals/&LOAD_REF/ActualDiffs.csv"
dbms=csv
replace;
run;
@@ -631,7 +631,7 @@ run;
%let apprno=%eval(&num_of_approvals_required-&num_of_approvals_remaining+1);
data work.append_review;
if 0 then set &mpelib..mpe_review;
-TABLE_ID="&TABLE";
+TABLE_ID="&LOAD_REF";
BASE_TABLE="&orig_libds";
REVIEW_STATUS_ID="APPROVED";
REVIEWED_BY_NM="&user";
@@ -641,7 +641,7 @@ run;
stop;
run;
%mp_lockanytable(LOCK,
-lib=&mpelib,ds=mpe_review,ref=%str(&table Approval),
+lib=&mpelib,ds=mpe_review,ref=%str(&LOAD_REF Approval),
ctl_ds=&mpelib..mpe_lockanytable
)
proc append base=&mpelib..mpe_review data=work.append_review;
@@ -653,7 +653,7 @@ run;
/* update mpe_submit table */
%mp_lockanytable(LOCK,
-lib=&mpelib,ds=mpe_submit,ref=%str(&table Approval in auditors/postdata),
+lib=&mpelib,ds=mpe_submit,ref=%str(&LOAD_REF Approval in auditors/postdata),
ctl_ds=&mpelib..mpe_lockanytable
)
proc sql;
@@ -662,7 +662,7 @@ run;
num_of_approvals_remaining=&num_of_approvals_remaining-1,
reviewed_by_nm="&user",
reviewed_on_dttm=&sastime
-where table_id="&table";
+where table_id="&LOAD_REF";
%mp_lockanytable(UNLOCK,
lib=&mpelib,ds=mpe_submit,
ctl_ds=&mpelib..mpe_lockanytable
@@ -688,7 +688,7 @@ run;
%mpe_alerts(alert_event=APPROVED
, alert_lib=&libref
, alert_ds=&ds
-, dsid=&TABLE
+, dsid=&LOAD_REF
)
%removecolsfromwork(___TMP___MD5)
View File
@@ -34,7 +34,8 @@ run;
%mp_testservice(&_program,
viyacontext=&defaultcontext,
inputdatasets=work.sascontroltable work.jsdata,
-outlib=web1
+outlib=web1,
mdebug=&sasjs_mdebug
)
%let status=0;
View File
@@ -14,8 +14,9 @@
<h5> sasdata </h5>
<h5> sasparams </h5>
Contains info on the request. One row is returned.
-* CLS_FLG - set to 0 if there are no CLS rules (everything should be editable)
+@li CLS_FLG - set to 0 if there are no CLS rules (everything should be editable)
else set to 1 (CLS rules exist)
@li ISMAP - set to 1 if the target DS is an excel map target, else 0
<h5> approvers </h5>
<h5> dqrules </h5>
@@ -534,6 +535,11 @@ data _null_;
run;
%put params;
%let ismap=0;
proc sql noprint;
select count(*) into: ismap from &mpelib..mpe_xlmap_info
where XLMAP_TARGETLIBDS="&orig_libds" and &dc_dttmtfmt. le TX_TO;
data sasparams;
length colHeaders $20000 filter_text $32767;
colHeaders=cats(upcase("%mf_getvarlist(sasdata1,dlm=%str(,))"));
@@ -551,8 +557,11 @@ data sasparams;
if %mf_nobs(work.cls_rules)=0 then cls_flag=0;
else cls_flag=1;
put (_all_)(=);
if "&orig_libds"="&mpelib..MPE_XLMAP_DATA" or &ismap ne 0 then ismap=1;
else ismap=0;
run;
/* Extract validation DQ Rules */
proc sort data=&mpelib..mpe_validations
(where=(&dc_dttmtfmt. le TX_TO
@@ -639,8 +648,6 @@ proc sort data=dqdata;
by base_col selectbox_order;
run;
-%mp_getmaxvarlengths(work.sasdata1,outds=maxvarlengths)
data maxvarlengths;
View File
@@ -141,13 +141,15 @@ run;
data work.fmts;
length fmtname $32;
fmtname="&fmtname";
type='N';
do start=1 to 10;
label= cats("&fmtname",start);
end=start;
output;
end;
run;
proc sort data=work.fmts nodupkey;
-by fmtname;
+by fmtname type start;
run;
proc format cntlin=work.fmts library=dctest.dcfmts;
run;
@@ -157,8 +159,9 @@ data work.inquery3;
infile datalines4 dsd;
input GROUP_LOGIC:$3. SUBGROUP_LOGIC:$3. SUBGROUP_ID:8. VARIABLE_NM:$32.
OPERATOR_NM:$10. RAW_VALUE:$4000.;
RAW_VALUE="'&fmtname'";
datalines4;
-AND,AND,1,FMTNAME,CONTAINS,"'&fmtname'"
+AND,AND,1,FMTNAME,CONTAINS,placeholder (see line above)
;;;;
run;
%mp_filterstore(
View File
@@ -0,0 +1,82 @@
/**
@file getxlmaps.sas
@brief Returns a list of rules and other info for a specific xlmap_id
<h4> Service Inputs </h4>
<h5> getxlmaps_in </h5>
|XLMAP_ID|
|---|
|Sample|
<h4> Service Outputs </h4>
<h5> xlmaprules </h5>
Filtered output of the dc.MPE_XLMAP_RULES table
|XLMAP_ID|XLMAP_RANGE_ID|XLMAP_SHEET|XLMAP_START|XLMAP_FINISH|
|---|---|---|---|---|
|Sample|Range1|Sheet1|ABSOLUTE A1| |
|Sample|Range2|Sheet1|RELATIVE R[2]C[2]|ABSOLUTE H11|
<h5> xlmapinfo </h5>
Extra info for a map id
|TARGET_DS|
|---|
|DCXXX.MPE_XLMAP_DATA|
<h4> SAS Macros </h4>
@li mp_abort.sas
@li mpeinit.sas
@version 9.3
@author 4GL Apps Ltd
@copyright 4GL Apps Ltd. This code may only be used within Data Controller
and may not be re-distributed or re-sold without the express permission of
4GL Apps Ltd.
**/
%mpeinit()
data _null_;
set work.getxlmaps_in;
putlog (_all_)(=);
call symputx('xlmap_id',xlmap_id);
run;
proc sql noprint;
create table work.xlmaprules as
select xlmap_id
,XLMAP_RANGE_ID
,XLMAP_SHEET
,XLMAP_START
,XLMAP_FINISH
from &mpelib..MPE_XLMAP_RULES
where &dc_dttmtfmt. lt tx_to and xlmap_id="&xlmap_id"
order by xlmap_sheet, xlmap_range_id;
%global target_ds;
select XLMAP_TARGETLIBDS into: target_ds
from &mpelib..MPE_XLMAP_INFO
where &dc_dttmtfmt. lt tx_to and xlmap_id="&xlmap_id";
%mp_abort(iftrue= (&syscc ne 0)
,mac=&_program..sas
,msg=%str(syscc=&syscc)
)
data work.xlmapinfo;
target_ds=coalescec("&target_ds","&mpelib..MPE_XLMAP_DATA");
output;
stop;
run;
%webout(OPEN)
%webout(OBJ,xlmaprules)
%webout(OBJ,xlmapinfo)
%webout(CLOSE)
View File
@@ -0,0 +1,58 @@
/**
@file
@brief testing getxlmaps service
<h4> SAS Macros </h4>
@li mf_getuniquefileref.sas
@li mx_testservice.sas
@li mp_assert.sas
@li mp_assertdsobs.sas
**/
%let _program=&appLoc/services/editors/getxlmaps;
/**
* Test 1 - basic send
*/
%let f1=%mf_getuniquefileref();
data _null_;
file &f1 termstr=crlf;
put 'XLMAP_ID:$char12.';
put "Sample";
run;
%mx_testservice(&_program,
viyacontext=&defaultcontext,
inputfiles=&f1:getxlmaps_in,
outlib=web1,
mdebug=&sasjs_mdebug
)
data work.xlmaprules;
set web1.xlmaprules;
putlog (_all_)(=);
run;
%mp_assertdsobs(work.xlmaprules,
test=ATLEAST 2,
desc=Checking successful return of at least 2 rules for the Sample map,
outds=work.test_results
)
/**
* Test 2 - info returned
*/
data work.xlmapinfo;
set web1.xlmapinfo;
putlog (_all_)(=);
call symputx('tgtds',target_ds);
run;
%mp_assert(
iftrue=(&tgtds=&dclib..MPE_XLMAP_DATA),
desc=Checking correct target table is returned,
outds=work.test_results
)
View File
@@ -146,7 +146,7 @@ select count(*) into: nobs from &syslast;
,msg=%str(Issue assigning library &orig_lib)
)
-%global txfrom txto processed;
+%global txfrom txto processed rk;
data _null_;
set &mpelib..MPE_TABLES;
@@ -154,12 +154,13 @@ data _null_;
call symputx('txfrom',var_txfrom);
call symputx('txto',var_txto);
call symputx('processed',var_processed);
if not missing(RK_UNDERLYING) then call symputx('rk',buskey);
run;
%mp_lockfilecheck(libds=&orig_libds)
data compare;
-set &libds(drop=&txfrom &txto &processed);
+set &libds(drop=&txfrom &txto &processed &rk);
stop;
run;
View File
@@ -24,6 +24,7 @@ proc format lib=DCTEST.DCFMTS cntlout=work.fmtextract;
run;
data work.jsdata;
set work.fmtextract;
fmtrow=_n_;
if _n_<5 then _____DELETE__THIS__RECORD_____='Yes';
else _____DELETE__THIS__RECORD_____='No';
if _n_>20 then stop;
View File
@@ -34,6 +34,7 @@ data work.staging_ds;
var_processed=upcase(var_processed);
close_vars=upcase(close_vars);
audit_libds=upcase(audit_libds);
rk_underlying=upcase(rk_underlying);
/* check for valid loadtype */
if LOADTYPE not in ('UPDATE','TXTEMPORAL','FORMAT_CAT','BITEMPORAL','REPLACE')
@@ -45,8 +46,12 @@ data work.staging_ds;
/* force correct BUSKEY and DSN when loading format catalogs */
if LOADTYPE='FORMAT_CAT' then do;
BUSKEY='TYPE FMTNAME FMTROW';
-if subpad(dsn,length(dsn)-3,3) ne '-FC' then dsn=cats(dsn,'-FC');
+DSN=scan(dsn,1,'-')!!'-FC';
end;
/* convert tabs into spaces */
buskey=translate(buskey," ","09"x);
rk_underlying=translate(rk_underlying," ","09"x);
run;
%mp_abort(iftrue=(&errflag=1)
View File
@@ -0,0 +1,69 @@
/**
@file
@brief Post Edit Hook script for the MPE_XLMAP_INFO table
@details Post edit hooks provide additional backend validation for user
provided data. The incoming dataset is named `work.staging_ds` and is
provided in mpe_loader.sas.
Available macro variables:
@li DC_LIBREF - The DC control library
@li LIBREF - The library of the dataset being edited (is assigned)
@li DS - The dataset being edited
<h4> SAS Macros </h4>
@li mf_existds.sas
@li mf_getvarlist.sas
@li mf_wordsinstr1butnotstr2.sas
@li dc_assignlib.sas
@li mp_validatecol.sas
**/
data work.staging_ds;
set work.staging_ds;
/* apply the first excel map to all cells */
length tgtds $41;
retain tgtds;
drop tgtds is_libds;
if _n_=1 then do;
if missing(XLMAP_TARGETLIBDS) then tgtds="&dc_libref..MPE_XLMAP_DATA";
else tgtds=upcase(XLMAP_TARGETLIBDS);
%mp_validatecol(XLMAP_TARGETLIBDS,LIBDS,is_libds)
call symputx('tgtds',tgtds);
call symputx('is_libds',is_libds);
end;
XLMAP_TARGETLIBDS=tgtds;
run;
%mp_abort(iftrue=(&is_libds ne 1)
,mac=mpe_xlmap_info_postedit
,msg=Invalid target dataset (&tgtds)
)
/**
* make sure that the supplied target dataset exists and
* has the necessary columns
*/
%dc_assignlib(READ,%scan(&tgtds,1,.))
%mp_abort(iftrue=(%mf_existds(libds=&tgtds) ne 1)
,mac=mpe_xlmap_info_postedit
,msg=Target dataset (&tgtds) could not be opened
)
%let tgtvars=%upcase(%mf_getvarlist(&tgtds));
%let srcvars=%upcase(%mf_getvarlist(&dc_libref..MPE_XLMAP_DATA));
%let badvars1=%mf_wordsInStr1ButNotStr2(Str1=&srcvars,Str2=&tgtvars);
%let badvars2=%mf_wordsInStr1ButNotStr2(Str1=&tgtvars,Str2=&srcvars);
%mp_abort(iftrue=(%length(&badvars1.X)>1)
,mac=mpe_xlmap_info_postedit
,msg=%str(Target dataset (&tgtds) has missing vars: &badvars1)
)
%mp_abort(iftrue=(%length(&badvars2.X)>1)
,mac=mpe_xlmap_info_postedit
,msg=%str(Target dataset (&tgtds) has unrecognised vars: &badvars2)
)
View File
@@ -0,0 +1,23 @@
/**
@file
@brief Post Edit Hook script for the MPE_XLMAP_RULES table
@details Post edit hooks provide additional backend validation for user
provided data. The incoming dataset is named `work.staging_ds` and is
provided in mpe_loader.sas.
Available macro variables:
@li DC_LIBREF - The DC control library
@li LIBREF - The library of the dataset being edited (is assigned)
@li DS - The dataset being edited
**/
data work.staging_ds;
set work.staging_ds;
/* ensure uppercasing */
XLMAP_ID=upcase(XLMAP_ID);
run;
View File
@@ -0,0 +1,22 @@
/**
@file
@brief Sample XLMAP Data hook program (sample_xlmap_data_postapprove)
@details This hook script should NOT be modified in place, as the changes
would be lost in your next Data Controller deployment.
Instead, create a copy of this hook script and place it OUTSIDE the
Data Controller metadata folder.
Available macro variables:
@li LOAD_REF - The Load Reference (unique upload id)
@li ORIG_LIBDS - The target library.dataset that was just loaded
**/
data _null_;
set work.staging_ds;
putlog 'load ref is in the staged data: ' load_ref;
stop;
run;
%put the unique identifier (LOAD_REF) is also a macro variable: &LOAD_REF;
View File
@@ -0,0 +1,49 @@
/**
@file
@brief Sample XLMAP Data hook program
@details This hook script should NOT be modified in place, as the changes
would be lost in your next Data Controller deployment.
Instead, create a copy of this hook script and place it OUTSIDE the
Data Controller metadata folder.
Available macro variables:
@li DC_LIBREF - The DC control library
@li LIBREF - The library of the dataset being edited (is assigned)
@li DS - The target dataset being loaded
**/
%let abort=0;
%let errmsg=;
data work.staging_ds;
set work.staging_ds;
length errmsg $1000;
drop err:;
/* KM1 validations */
if XLMAP_ID='BASEL-KM1' then do;
if XLMAP_RANGE_ID='KM1:a' & input(value_txt,8.)<100 then do;
errmsg='Should be greater than 100';
err=1;
end;
end;
/* CR2 Validations */
if XLMAP_ID='BASEL-CR2' then do;
if XLMAP_RANGE_ID='CR2-sec1' & row_no=3 & input(value_txt,8.)>0 then do;
errmsg='Should be negative';
err=1;
end;
end;
/* publish error message */
if err=1 then do;
errmsg=catx(' ',xlmap_range_id,':',value_txt,'->',errmsg);
call symputx('errmsg',errmsg);
call symputx('abort',1);
end;
run;
%mp_abort(iftrue=(&abort ne 0)
,mac=xlmap_data_postedit
,msg=%superq(errmsg)
)
View File
@@ -3,10 +3,8 @@
@brief List the libraries and tables the mp-editor user can access
@details If user is in a control group (&mpeadmins, configured in mpeinit.sas)
then they have access to all libraries / tables. Otherwise a join is made
-to the &mpelib..mp_editor_access table.
+to the &mpelib..mpe_security table.
-This service is also callable from EUCs - just add EUCDLM= parameter.
-EUCDLM values: TAB or CSV
<h4> SAS Macros </h4>
@li mf_getuser.sas
@@ -129,10 +127,26 @@ create table saslibs as
,msg=%str(issue with security validation)
)
proc sql;
create table work.xlmaps as
select distinct a.XLMAP_ID
,b.XLMAP_DESCRIPTION
,coalescec(b.XLMAP_TARGETLIBDS,"&mpelib..MPE_XLMAP_DATA")
as XLMAP_TARGETLIBDS
from &mpelib..MPE_XLMAP_RULES a
left join &mpelib..MPE_XLMAP_INFO(where=(&dc_dttmtfmt. lt tx_to)) b
on a.XLMAP_ID=b.XLMAP_ID
where &dc_dttmtfmt. lt a.tx_to;
/* we don't want the XLMAP target datasets to be directly editable */
delete from sasdatasets
where cats(libref,'.',dsn) in (select XLMAP_TARGETLIBDS from xlmaps);
%webout(OPEN)
%webout(OBJ,sasDatasets)
%webout(OBJ,saslibs)
%webout(OBJ,globvars)
%webout(ARR,xlmaps)
%webout(CLOSE)
%mpeterm()
View File
@@ -16,13 +16,24 @@
)
-data globvars;
+data work.globvars;
set webout.globvars;
putlog (_all_)(=);
run;
data work.xlmaps;
set webout.xlmaps;
putlog (_all_)(=);
run;
%mp_assertdsobs(work.globvars,
desc=Fromsas table returned,
test=HASOBS,
outds=work.test_results
)
%mp_assertdsobs(work.xlmaps,
desc=xlmaps table returned,
test=HASOBS,
outds=work.test_results
)
View File
@@ -290,6 +290,7 @@ run;
run;
data work.groups;
length groupuri groupname $32 groupdesc $128 ;
call missing (of _all_);
output;
stop;
run;