Commit fd510b88c4 by Mihajlo Medjedovic, 2024-06-05 15:53:46 +02:00
270 changed files with 27578 additions and 0 deletions

.github/workflows/publish.yml (vendored, new file, 31 lines)

@@ -0,0 +1,31 @@
name: Gatsby Publish
on:
  push:
    branches: main
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [14.17.0]
    steps:
      - uses: actions/checkout@v1
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v2
        with:
          node-version: ${{ matrix.node-version }}
          cache: npm
      - name: Check prettier offences
        run: npm run lint
      - uses: enriikke/gatsby-gh-pages-action@v2
        with:
          access-token: ${{ secrets.DCSITE }}
          deploy-branch: gh-pages
          gatsby-args: --prefix-paths

.gitignore (vendored, new file, 5 lines)

@@ -0,0 +1,5 @@
node_modules/
.cache/
public
.DS_Store
fsedit/

.gitpod.yml (new file, 9 lines)

@@ -0,0 +1,9 @@
# This configuration file was automatically generated by Gitpod.
# Please adjust to your needs (see https://www.gitpod.io/docs/config-gitpod-file)
# and commit this file to your remote git repository to share the goodness with others.
tasks:
  - init: npm install && npm run build
    command: npm run start

.nvmrc (new file, 1 line)

@@ -0,0 +1 @@
v14.17.0

.prettierrc (new file, 7 lines)

@@ -0,0 +1,7 @@
{
  "trailingComma": "none",
  "tabWidth": 2,
  "semi": false,
  "singleQuote": true,
  "endOfLine": "auto"
}

README.md (new file, 32 lines)

@@ -0,0 +1,32 @@
## 🚀 Quick start

1. **Gatsby CLI**

   This app is built with Gatsby CLI version 2.12.59.

2. **Start developing.**

   Navigate into your site's directory and start it up.

   ```shell
   cd datacontroller.io/
   npm run develop
   ```

3. **Open the code and start customizing!**

   Your site is now running at http://localhost:8000!

4. **Learn more**

   See the [Embed Video plugin](https://www.gatsbyjs.com/plugins/gatsby-remark-embed-video/). Examples:

   - `video: [VideoTitle](https://www.youtube.com/embed/2Xc9gXyf2G4)`
   - `youtube: https://www.youtube.com/watch?v=XrK3hmYO4ag`
   - `vimeo: https://vimeo.com/417808409`

   To add an inline image with text aligned left or right, use the following syntax and place the image in /static:

   - `<img class="alignright" src="/wp-content/uploads/2021/04/2IrsV7v.png" alt="Title" width="352" height="442" />`
   - `<img class="alignleft" src="/wp-content/uploads/2021/04/2IrsV7v.png" alt="Title" width="352" height="442" />`

Binary image files added (not shown): 377 KiB, 212 KiB, 163 KiB.

@@ -0,0 +1,70 @@
---
title: v3.12 Release Four New Data Management Features
description: Four fantastic new features have been added to Data Controller - row level security, dynamic cell dropdowns, excel formula support and dynamic filtering.
date: '2021-05-13 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './cell_validation1.png'
tags:
- Data Quality
- End User Computing
- Excel
- Releases
---
Thanks to a customer investment, we have four fantastic new features!  These are:
- Dynamic Filtering (returned values filtered by other clauses in the same filter)
- Ability to upload Excel formulas into SAS
- Row Level Security for all tables in SAS
- Dynamic Cell Dropdown
That last feature (dynamic cell dropdown) is turning out to be Data Quality Dynamite.  Let's start with it.
## Dynamic Cell Dropdown
This feature allows you to create a backend SAS program to generate the values for the cell dropdown.  The program receives the ROW as input, so you can make the values dependent on other values - such as returning a list of product codes for a particular region, or people within a department.  We are now using it extensively within Data Controller to display tables for a particular library, or columns for a particular table.
The SAS program can live on the filesystem, or it can be a SAS 9 Stored Process or Viya Job. The dropdown can be "HARD" (the user must select a value) or "SOFT" (the user can also type their own value).
![Dynamic Cells in SAS](./cell_validation1.png)
More info in [documentation](https://docs.datacontroller.io/dynamic-cell-dropdown).
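As a minimal sketch, such a backend program might look like the following, assuming the edited row arrives as `work.source_row` (as described in the v3.13 release notes). The `mylib.products` lookup table and its columns are purely illustrative assumptions, not part of the product:

```sas
/* Illustrative hook: dropdown values depend on the REGION of the     */
/* row being edited. mylib.products and its columns are assumptions.  */
proc sql;
  create table work.dynamic_values as
    select distinct p.product_cd
    from mylib.products p
    inner join work.source_row r
      on p.region = r.region;
quit;
```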
## Row Level Security
This feature allows you to create complex queries to determine which groups can access which rows in which tables (either for the EDIT menu, VIEW menu or both).  If a user is in multiple groups, these rules are joined with an OR condition, allowing additional rows with additional group memberships.  The rules are also applied for data uploads, preventing users from modifying records that they do not have permission to access.
![Row Level Security in Data Controller for SAS](rls_table.png)
Full details available in the [documentation](https://docs.datacontroller.io/row-level-security).
## Dynamic Filtering
Previously, when using the filter mechanism, all values were always returned in the selection box.
![Data Controller for SAS previous selection box](filter_dynamic_off.png)
Now, when the dynamic where clause box is checked (which is the default), the values in the _second and subsequent_ filter clauses are filtered by the previous filter clause settings, eg:
![](filter_dynamic_on.png)
## Excel Formulas
It is now possible to configure certain columns to be extracted as formulae, rather than raw values. The target column must be character, and it should be wide enough to support the longest formula in the source data. If the order of values is important, you should include a row number in your primary key.
`video: [Retain Formulas when Loading Excel to SAS](https://www.youtube-nocookie.com/embed/Reg803vI2Ak)`
Full configuration information in the [docs](https://docs.datacontroller.io/excel).
## Other Stuff
Further updates since the [v3.11 release](/version-3-11-release-notes-redshift-locale-proc-transpose) include:
- Support for E8601DA and B8601DA date formats
- Addition of a max-depth option in Data Lineage to enable exploration when the lineage is HUGE
- Optimisations to enable rendering of large lineage diagrams, and options to perform background rendering into PNG / SVG files
- Fixed bug where some restricted users could view all approve history
- Fixed bug where users could not see their list of submits
- Excel Exports are now enabled where SAS/ACCESS for PC Files is not licensed
- Previously, not all Viya Users were being returned in the User Navigator.  They are now.
Would you like to give Data Controller a whirl?  We're waiting to [hear from you](/contact)!

Binary image file added (not shown): 700 KiB.

@@ -0,0 +1,47 @@
---
title: "v3.13 Release: Extended Data Validation and Native Postgres Support"
description: Data Controller now provides additional support for creating dynamic cell dropdowns as well as native Postgres support.
date: '2021-09-06 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './validation1.png'
tags:
- Data Quality
- Releases
- SAS
---
This release contains a number of small fixes and UI improvements, and two major features:
* Extended Data Validations
* Native Postgres Support
## Extended Data Validations
In the [previous release](/3-12-four-new-data-management-features) we provided a feature that allows a SAS developer to create a program (using the source row as input) to determine the values of a particular dropdown.
This feature has now been extended, to allow the response to contain dropdowns for other cells in the same row. Default values can also be provided for each additional dropdown.
`video: [Retain Formulas when Loading Excel to SAS](https://www.youtube-nocookie.com/embed/rmES77aIr90)`
To make this work, the SAS developer simply needs to write a SAS program that takes a source table named `work.source_row` (the row being edited) and creates two output tables:
* `work.dynamic_values` - the first dropdown
* `work.dynamic_extended_values` - the additional dropdowns, and any defaults.
Your SAS Program (hook script) can be a file on the filesystem (in which case it must end with ".sas") or it can also be a Stored Process or Viya Job in the logical folder tree (metadata or SAS Drive) - in which case it must _not_ end with ".sas". In both cases you should provide the full path and filename in the MPE_VALIDATIONS table.
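A minimal hook program following this convention might look like the sketch below. Only the `work.source_row` input and the two output table names come from the text above; every other dataset and column name is an assumption for illustration:

```sas
proc sql;
  /* dropdown for the cell being edited */
  create table work.dynamic_values as
    select distinct p.product_cd
    from mylib.products p
    inner join work.source_row r
      on p.region = r.region;

  /* dropdowns (and a default) for another cell in the same row */
  create table work.dynamic_extended_values as
    select "DISCOUNT_CD" as extra_col,
           d.discount_cd as display_value,
           case when d.discount_cd = 'STD' then 1 else 0 end as default_flag
    from mylib.discounts d
    inner join work.source_row r
      on d.region = r.region;
quit;
```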
More info in [documentation](https://docs.datacontroller.io/dynamic-cell-dropdown).
## Native Postgres Support
Alongside Redshift and SQL Server we now have native support for Postgres. What does this mean?
Thanks to SAS/ACCESS engines, we can automatically support a very wide range of database engines. However, load times can become significant when the target contains millions (or billions) of rows.
In order to provide "native" support, we updated our load process to 'inject' a temporary table using SQL passthrough, which results in significantly faster updates for certain load types, such as SCD2.
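In SAS terms, that staging step can be sketched with explicit passthrough. The connection options, table names, and SCD2 close-out logic below are illustrative assumptions, not the actual Data Controller code:

```sas
proc sql;
  connect to postgres as pg (server=dbhost database=dc user=&sysuserid);
  /* stage the incoming rows in a temporary table on the database side */
  execute (
    create temporary table stg_sales (like sales)
  ) by pg;
  /* ... bulk-load the staged rows here, then apply the change natively ... */
  execute (
    update sales s
       set valid_to = now()
      from stg_sales g
     where s.id = g.id and s.valid_to is null
  ) by pg;
  disconnect from pg;
quit;
```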
<hr>
Did you know Data Controller (Community Edition) is free, for unlimited users? [Contact us](/contact) for your copy!

Binary image files added (not shown): 1.1 MiB, 898 KiB, 49 KiB.

@@ -0,0 +1,126 @@
---
title: Five Zero-Code ways to Import Excel into SAS
description: Five zero-code ways to import Excel into SAS - be that on Viya, SAS 9 with Metadata, or good old Base SAS on your desktop.
date: '2021-04-18 10:59:18'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './dcgrid.png'
tags:
- Data Management
- EUC
- Excel
---
Your data is in Excel and you need to import it into SAS. You googled, and discovered 5000 different methods. Which to choose? We compare and contrast 5 approaches to this perennial problem. Before we get onto that though - <strong>why</strong> is it such a problem?
The crux of the issue is: <h2>Flexibility vs Scalability</h2>
Excel, as you probably know, is incredibly flexible. Data can spread in all directions, move around, be positioned anywhere, on any cell, of any worksheet in a workbook. That workbook can have different names, exist in different locations, be of different types (xls, xlsx, xlsm). That's before we get down to whether the data arrives as values, formatted values, formulas, or [other dragons](https://www.linkedin.com/posts/allanbowe_data-engineers-in-your-experience-what-activity-6771408875544461312-Weqt). And the fact that, as it is typically stored on a shared filesystem, it can be changed by anyone, at any time.
SAS is far less flexible in this regard. Data is nearly always structured in a table, with fixed columns, of fixed data types, in a fixed library / location, with a fixed name (or naming convention). That table usually lives on a server, perhaps in a database. This rigidity is actually a Very. Good. Thing. It provides consistency, which is the basis for scalability. And the basis of the consistency is how the data is modelled.
## The Data Model
Every table in SAS contains some kind of metadata about how the data is structured - the column names, types (character vs numeric), formats (dates, currency), lengths, encoding (UTF8 vs WLATIN1) and more. The first question you need to ask yourself, when loading Excel data into SAS, is - do I take the model from Excel? Or am I targeting an existing model in SAS?
Speaking of SAS. The word "SAS" can mean so many things - do we mean the language? The platform? The company? A specific product? Let's break this down, as the choice of tool will depend on the type of "SAS" you have.
## Which Flavour of SAS do I have?
The world of SAS can be broken into 3 major platforms:
- Base SAS. Traditional SAS, typically installed on your desktop.
- SAS Meta. An enterprise deployment with mid-tier and metadata server.
- SAS Viya. Cloud native, API driven microservices architecture.
The options available to you for importing Excel will vary depending on the flavour you are using. How do you know which one you have? Try running the following code in SAS:
```sas
filename mc url "https://raw.githubusercontent.com/sasjs/core/main/all.sas";
%inc mc;
%put %mf_getplatform();
```
The entry in the log will tell you if your environment is BASESAS, SASMETA or SASVIYA.
## Importing Excel into SAS
Without further ado, let's explore the options available!
### 1 - Import Excel with Data Controller for SAS
It's <em>super easy</em> to import an arbitrary Excel file to an <strong>existing</strong> table using Data Controller for SAS. You simply choose the table you'd like to modify, then drag your Excel file into the browser.
Data Controller will scan every worksheet in your Excel file to find a range that matches the target table. How does it do that? The top of the range is identified by simply checking for a <em>row</em> that contains<em> all columns</em> as per the target table definition, whereas the bottom of the range is simply the first blank primary key value.
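The matching logic described above can be sketched as follows. This is illustrative Python, not Data Controller's actual implementation, and it assumes the first target column is the primary key:

```python
def find_table_range(rows, target_cols):
    """Locate a target-table range in a worksheet (sketch).

    Top of range: the first row containing *all* target column names.
    Bottom: the first subsequent row whose primary-key cell is blank.
    Returns (header_row_index, end_row_index_exclusive) or None.
    """
    wanted = [c.strip().lower() for c in target_cols]
    for top, row in enumerate(rows):
        cells = [str(c).strip().lower() if c is not None else "" for c in row]
        if all(w in cells for w in wanted):
            pk_idx = cells.index(wanted[0])  # column position of the primary key
            break
    else:
        return None  # no row matches the full target column list
    bottom = top + 1
    while bottom < len(rows):
        row = rows[bottom]
        pk = row[pk_idx] if pk_idx < len(row) else None
        if pk is None or str(pk).strip() == "":
            break  # first blank primary-key value ends the range
        bottom += 1
    return top, bottom

sheet = [
    ["notes", None, None],
    ["ID", "NAME", "REGION"],   # header row matching the target table
    [1, "Widget", "EU"],
    [2, "Gadget", "US"],
    ["", None, None],           # blank primary key -> end of range
    ["footer", None, None],
]
print(find_table_range(sheet, ["ID", "NAME", "REGION"]))  # -> (1, 4)
```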
Once the range is found, Data Controller will perform a series of automatic checks and validations, and import the data and a copy of the workbook. If all the checks pass, it's one button click to load the data (and corresponding workbook) to the SAS server and notify the approver that a change request has been submitted.
`vimeo: https://vimeo.com/417808409`
Once the approver approves the change, the table is updated. There is a full audit history and you can even retrieve the original excel file that was submitted. The tool does NOT let you load <em>new</em> tables to SAS, nor does it let you modify the columns. The <em>model</em> is therefore protected by IT, whilst the <em>values</em> are managed by the business.
Data Controller is available for all 3 SAS platforms (SAS Meta, SAS Viya, Base SAS). Like what you see? Don't hesitate to [book a demo session](https://datacontroller.io) and meet the team!
### 2 - Import Excel using SAS Studio
<img class="alignright" src="/wp-content/uploads/2021/04/2IrsV7v.png" alt="Upload File to SAS Studio" width="352" height="442" />
This approach can be used whether you are using SAS Meta, SAS Viya, or even Base SAS (eg with University Edition). Just open [YOURSERVER]/SASStudio in a browser (on Viya, it will be /SASStudio<strong>V</strong>).
The first step will be to get the Excel file to a location where it can be accessed by SAS.
On Viya, that will require opening the Explorer menu, expanding the server dropdown, and right clicking on the directory within. You can then choose the "Upload files" option to import your spreadsheet. Your upload is limited to 100mb (default system setting), and the target directory is typically the unix home directory for your user.
The "home directory" part is a key point - as it means that other users will not necessarily be able to access that source file. To easily load to other areas on the SAS filesystem you may need to ask your admin to create a symlink, or use an alternative upload mechanism (such as [this one](https://sasjs.io/apps/#viya-file-uploader)).
<img class="alignleft" src="/wp-content/uploads/2021/04/2021-04-10_15-53.png" alt="Import Excel to SAS Studio" width="218" height="203" />
Once your file is available on the server, you can begin the import process. Simply click on the Start Page, "New Import" and follow the steps in the wizard.
The generated SAS code will be shown in the window below, and the output can be directed either to WORK or to a permanent library, as desired.
You may need to rename the default target dataset (eg from IMPORT to IMPORT2) in order to run the code.
### 3 - How to Import Excel using the SAS Add-In for Microsoft Office
Unlike the other examples presented, this one allows you to load data from directly within your Excel workbook! You must have the requisite permissions to write data back to the selected target table. You can also modify column properties and specify an 'inactivity timeout' before 'edit mode' is closed.
To use, simply open your desired table and click the "Begin Edit" button in the SAS Ribbon.
`youtube: https://www.youtube.com/watch?v=XrK3hmYO4ag`
The SAS Add-In for Microsoft Office is available only for SAS Meta deployments.
### 4 - Import Excel using the SAS Enterprise Guide Wizard
<img class="alignright" src="/wp-content/uploads/2021/04/import_excel_into_EG_03.jpg" alt="SAS Enterprise Guide" width="500" height="284" />
To import a spreadsheet using Enterprise Guide you can simply click "File" then "Import Data" and select your Excel file to proceed through the wizard. An excellent guide to this process is available [here](https://bi-notes.com/sas-enterprise-guide-import-excel/). This process will load an Excel table into your SAS project, where you can run further analyses.
This approach will work for both SAS Meta and Base SAS deployments, the key difference being that for SAS Meta your tables will be on the SAS Server as opposed to the local desktop.
### 5 - Import Excel using the SAS Data Integration Studio
This option is more for SAS 9 ETL developers building enterprise data flows from stable data sources. Many ETL teams are forced to build flows from Excel, despite its (deserved) reputation as an "unstable data source".
One way to perform this task is to set up a library using the EXCEL engine, then register the tables within it. This involves a number of steps, the screenshots for which are below!
<img class="aligncenter" src="/wp-content/uploads/2021/04/Excel-in-DI-Studio.png" alt="Import Excel to SAS DI Studio" width="683" height="984" />
## Comparison of Methods
With so many methods, how do you choose the one that is right for you? This depends on the volume, velocity, variety, and <em>purpose</em> of the data you are loading. If your Excel file is large, has a static structure, and arrives directly from a source system on a regular basis in a fixed location, then you would probably want to build an automated flow using Data Integration Studio. For ad-hoc data prepared by technical analysts for departmental reporting, either Enterprise Guide, SAS Studio or the SAS Add-In are potential choices.
For <em>business-sourced</em> data (such as model parameter sets, reference data, actuarial assumptions) that needs to be updated in an <em>IT-secured</em> environment, Data Controller is an <strong>ideal choice</strong> - particularly given that it eliminates the need for a shared directory and reduces the risk of downstream batch incidents thanks to its 'validate on load' features.
Data Controller works well as a zero-code option for Excel imports, in the following scenarios:
- The extraction process must be dynamic, as data can sometimes have additional columns or differently named worksheets
- You need to upload data rapidly and don't have time for a fully automated ETL solution to be built &amp; deployed
- You must retain the original Excel, along with change metadata, for audit purposes
- Your SAS Admin does not have capacity for ad-hoc data modification requests
- You would like to separate the role of Data Submitter and Data Approver
- Your data model needs protecting from accidental corruption
- You need automatic Data Quality rules applied at source
Below is a further comparison of the different options:
<img class="aligncenter size-full" src="/wp-content/uploads/2021/04/dcgrid-1.png" alt="Data Controller compared" width="558" height="436" />
If you'd like to discuss potential use cases for Data Controller, or to get a deep dive into any of its features, you can begin the process right now by requesting a [demo session](https://datacontroller.io/contact)!

Binary image file added (not shown): 24 KiB.

@@ -0,0 +1,58 @@
---
title: Allianz Insurance and Data Controller for SAS®
description: Data Controller for SAS is used to provide the actuarial team at Allianz with an easy to use bitemporal reporting capability
date: '2021-03-26 11:20:56'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './Allianz_logo.svg_.png'
tags:
- Use Cases
---
Data Controller for SAS was deployed at Allianz to support a bitemporal reporting capability for the actuarial team. [Bitemporal](https://datacontroller.io/bitemporal-historisation-and-the-sas-dds/) tables allow data to be stored across two time dimensions - a business (reporting) history and a technical (version) history. This enables results for any previous reporting period to be reproduced 'as at' any point in time, providing full auditability and traceability. By using Data Controller, the actuarial team at Allianz were able to upload data from multiple sources, and have it automatically merged into their SAS reporting database. We caught up with Joris Jansen, Actuary for P&amp;C Reserving.
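As a sketch, reproducing a prior reporting period 'as at' a point in technical time might look like this. The library, table, and column names are illustrative assumptions, loosely following SAS DDS conventions:

```sas
/* Figures for the Dec-2020 reporting period, as they stood on 15 Feb 2021 */
proc sql;
  select *
  from dds.claim_reserves
  where reporting_dt = '31DEC2020'd                 /* business time  */
    and tech_from_dttm <= '15FEB2021:00:00:00'dt    /* technical time */
    and tech_to_dttm   >  '15FEB2021:00:00:00'dt;
quit;
```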
---
<img class="alignright" src="/wp-content/uploads/2021/03/view.jpeg" alt="Data Controller for SAS at Allianz Insurance" width="300" height="169" />
### Joris, hallo! Could you please explain to us, your role within Allianz?
My role, Allan, is a "reporting role" in the business of the Allianz Benelux insurance company. This means that every month I need to report all the figures to my management in an accurate, complete and timely way - and this is important for Allianz Management in order to steer the company in the right way. I'm responsible for ensuring the reporting processes are performed properly, and of course, that the output is correct. The Data Controller is helping me to do this.
### Great! And how does it help? What do you use Data Controller for?
The Data Controller I use for the upload of the source data for my calculations. You can imagine I have source data from let's say, different systems or queries or whatever, and I upload this with the Data Controller. I then have the data available and can perform my calculations on the data in SAS, and output the figures for reporting.
### And why is it that you use Data Controller and not any other tools for uploading that data?
Well I think that you, Allan, have given Allianz a really easy tool to use, to do this part in a uniform, simple and proper way. Also the sources are from different systems so the unification of those different data sources isn't that easy - and Data Controller is a really simple tool to make that happen. I have my inputs from different systems available and in a relatively easy way I can upload them.
Automation may be much better but it's not that easy and practical to do.
### I see. And what format do these files arrive in?
The inputs are all Excel or CSV files and it's easy to input them.
### What happens to this data after it is uploaded?
There is a [hook script](https://docs.datacontroller.io/dcc-tables/#pre_edit_hook) that runs after the upload that makes sure that all the data is captured in one big database, which is a convenient way to handle the data so it's all together in one place. Before Data Controller, this wasn't the case. The advantage of having this data all merged automatically with each upload means that our subsequent processes that calculate the amounts as an output from the source is more uniform. Where we previously had many programs and input locations that would make calculations on top of one another, there are now just 1 to 3 programs.
### Is anyone else making use of Data Controller?
Currently there are three heavy users, and also other departments make use of it, eg the audit department, concerned with models and calculations, will make checks to ensure it works properly on a regular basis.
### What would you say are your favourite features in Data Controller, and why?
The most important feature is the uniform way in which data is uploaded and merged automatically in one database. Which means we always have our data, in an accurate and complete way, available. It is not that easy to adjust the database, and before Data Controller was in place it was possible to accidentally modify data because it was just sitting on the directory, the network drive. It is now not so easy to do that, the data is uploaded in a uniform way and always protected. We have a better, more proper way of reporting.
The second is that if auditors have extra questions, or would like to investigate other scenarios for calculating the outputs, then we can re-perform very easily the calculation because the data is already available [* with bi-temporal history in a database]. We don't need to do something extra, we can recalculate whenever we want. It's not something we use regularly but it is there, if there are questions from audit or if we would like to change something on the longer term etc, for change management purposes. This is a very nice feature.
### Glad to hear it! Is there anything you'd like to see in DC?
I'm currently very satisfied, it works, it's very fast, it's really quick to use. Which makes it very efficient.
### Erg bedankt!
---
The previous article in this series is available [here](/siemens-healthineers-smart-data-catalog/).

Binary image files added (not shown): 52 KiB, 26 KiB. One file diff suppressed because its lines are too long.

@@ -0,0 +1,40 @@
---
title: Data Controller - a Developer Perspective
description: Rafal Gagor - veteran SAS Developer - shares his thoughts and experience of using Data Controller for SAS® on a client project.
date: '2020-07-29 09:11:59'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './IMG-20190430-WA0049.jpg'
tags:
- Data Lineage
- Metadata
- Release Management
- SAS
- Use Cases
---
<h1>What problem does Data Controller for SAS® solve?</h1>
<div class="imgHolder alignright"><a href="https://www.linkedin.com/in/rgagor/"><img src="/wp-content/uploads/2020/07/IMG-20190430-WA0049.jpg" alt="Rafal Gagor - Veteran SAS Developer" width="180" height="180" /></a><div><span>Rafal Gagor - Veteran SAS Developer</span></div></div>
It's a question we get asked a lot, and so this is the first of a series of articles that explore real users and their actual use cases. We caught up with [Rafal Gagor](https://www.linkedin.com/in/rgagor/), a DI Developer with 2 decades of SAS and Financial Services experience, to get his impressions after using Data Controller for SAS on a client project.
## So, Rafal - what did your Client use Data Controller for?
Data Controller was implemented initially as the backbone of a [SASjs](https://sasjs.io) Release Management system - it allowed my colleagues and I to upload, for each promote, a list of affected SAS artefacts along with details of the release, and associated JIRA tickets. We could make changes directly via the web interface, or by uploading an Excel file. It was great to capture that information automatically in a database and have data quality rules applied at source. The resultant "clean" data enabled the delivery of a robust release management web application that saved hours of manual effort each week.
## Nice use case. How did you manage before you had Data Controller?
Previously, release management was a process performed manually and inconsistently, with data scattered across dozens of Excel and Word documents - it was not brought into SAS at all. In the case of other, regular, business-sourced tables that needed to be uploaded - the options were to either hand-craft an upload process manually as a "one off" using Enterprise Guide, or to build (and deploy) an ETL flow sourced from an Excel or a CSV file deployed to a network drive.
This option was problematic - how frequently to run the flow? What if the file format changed? What if the target table changed? It was therefore quite convenient to have the ability to hand such processes back to the data owner, who could safely modify the data within Data Controller without running the risk of overwriting any indexes or otherwise changing the schema of the table.
## Last question. What were your favourite Data Controller features?
Probably my favourite feature was the **Metadata Navigator** - I hadn't been able to use this since moving away from Base SAS quite some years ago. It was useful to be able to navigate through the objects and associations, and view the properties and attributes, without writing any code. Next up was the **Data Lineage** explorer.
When the business told me there was an issue with a particular field, it was really helpful to use the Data Controller graphical tools - at both table and column level - to perform a reverse (Target to Source) lineage diagram and quickly understand the data flow. This avoided the need to open up every job in DI Studio and explore the transforms.
Although it's a basic feature, it was great to use the **Data Viewer** to quickly examine and explore the raw tables without locking the datasets (and hence running the risk of stopping a batch run). The full-table search was a neat touch, as well as the DDL export option. Finally, I liked the fact that there were separate buttons for SUBMIT and APPROVE - a bit like a database where you have to commit the change. It's a nice approach that gives an extra layer of validation for the changes uploaded.
## Rafal - many thanks!

Binary image file added (not shown): 5.0 KiB.

@@ -0,0 +1,24 @@
---
title: Data Quality and the NBB_2017_27 Circular
description: The 3 Principles of NBB_2017_27 require significant documentation. Data Controller reduces costs of compliance and improves Timeliness of data.
date: '2018-10-13 21:52:06'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './download.png'
tags:
- Basel III
- Data Quality
- Data Quality Framework
- NBB
- NBB_2017_27
- Regulatory
- SAS
- Solvency II
---
When applying financial regulations in the EU (such as Solvency II, Basel III or GDPR) it is common for Member States to maintain or introduce national provisions to further specify how such rules might be applied. The National Bank of Belgium (NBB) is no stranger to this, and releases a steady stream of circulars via their <a href="https://www.nbb.be/en/financial-oversight/general/news/circulars-and-communications">website</a>. The <a href="https://www.nbb.be/doc/cp/eng/2017/20171012_nbb_2017_27.pdf">circular</a> of 12th October 2017 (NBB_2017_27, Jan Smets) is particularly interesting as it lays out a number of concrete recommendations for Belgian financial institutions with regard to Data Quality - and stated that these should be applied to internal reporting processes as well as the prudential data submitted. This fact is well known by affected industry participants, who have already performed a self assessment for YE2017 and reviewed documentation expectations as part of the HY2018 submission. <h2>Quality of External Data</h2> The DQ requirements for reporting are described by the 6 <a href="https://www.nbb.be/doc/cp/eng/2017/20171012_nbb_2017_27_annex.pdf">dimensions</a> (Accuracy, Reliability, Completeness, Consistency, Plausibility, Timeliness), as well as the Data Quality Framework described by Patrick Hogan <a href="https://www.bankingsupervision.europa.eu/press/conferences/sup_rep_conf/shared/pdf/Item4_1_PatrickHogan.pdf">here</a> and <a href="https://www.bankingsupervision.europa.eu/press/conferences/sup_rep_conf/shared/pdf/2017/Data_quality_framework_tools_and_products.pdf">here</a>. There are a number of 'hard checks' implemented in OneGate as part of the XBRL submissions, which are kept up to date <a href="http://www.eba.europa.eu/risk-analysis-and-data/reporting-frameworks">here</a>. 
However, OneGate cannot be used as a validation tool - the regulators will be monitoring the <strong>reliability</strong> of submissions by comparing the magnitude of change between resubmissions! Not to mention the data <strong>plausibility</strong> (changes in submitted values over time). <h2>Data Quality Culture</h2> When it comes to internal processes, CRO's across Belgium must now demonstrate to accredited statutory auditors that they satisfy the 3 Principles of the circular (Governance, Technical Capacities, Process). A long list of action points are detailed - it's clear that a <em>lot</em> of documentation will be required to fulfil these obligations! And not only that - the documentation will need to be continually updated and maintained. It's fair to say that automated solutions have the potential to provide significant time &amp; cost savings in this regard. <h2>Data Controller for SAS®</h2> The Data Controller is a web based solution for capturing data from users. Data Quality is applied at source, changes are routed through an approval process before being applied, and all updates are captured for subsequent audit. The tool provides evidence of compliance with NBB_2017_27 in the following ways: <h4>Separation of Roles for Data Preparation and Validation (principle 1.2)</h4> Data Controller differentiates between Editors (who provide the data) and Approvers (who sign it off). Editors stage data via the web interface, or by direct file upload. Approvers are then shown the new, changed, or deleted records - and can accept or reject the update. <a href="/wp-content/uploads/2018/10/Screen-Shot-2018-10-13-at-22.50.56.png"><img class="aligncenter wp-image-962" src="/wp-content/uploads/2018/10/Screen-Shot-2018-10-13-at-22.50.56.png" alt="" width="553" height="296" /></a>
<h4>Capacities established should ensure compliance in times of stress (principle 2.1)</h4>
As an Enterprise tool, the Data Controller is as scalable and resilient as your existing SAS platform.
<h4>Capture of Errors and Inconsistencies (principle 2.2)</h4>
Data Controller has a number of features to ensure timely detection of Data Quality issues at source (such as cell validation, post edit hook scripts, duplicate removals, rejection of data with missing columns, etc.). Where errors do make it into the system, a full history is kept (logs, copies of files etc) for all uploads and approvals. Email notifications of such errors can be configured for follow-up.
<h4>Tools and Techniques for Information Management Should be Automated (principle 2.3)</h4>
The Data Controller can be configured to execute specific .sas programs after data validation. This enables the development of a secure and <em>integrated</em> workflow, and helps companies to avoid the additional documentation penalties associated with "miscellaneous unconnected computer applications" and manual information processing. <a href="/wp-content/uploads/2018/10/Screen-Shot-2018-10-13-at-22.53.38.png"><img class="aligncenter wp-image-963" src="/wp-content/uploads/2018/10/Screen-Shot-2018-10-13-at-22.53.38.png" alt="" width="278" height="128" /></a>
<h4>Periodic Review &amp; Improvements (principles 2.4 and 3.4)</h4>
The Data Controller is actively maintained with the specific aim of reducing the cost of compliance with regulations such as NBB_2017_27. Our <a href="https://slides.com/allanbowe/datacontroller/#/">roadmap</a> includes new features such as pre-canned reports, version 'signoff', and the ability to reinstate previous versions of data.
<h4>A process for correction and final validation of reporting before submission (principle 3.1)</h4>
As a primary and dedicated tool for data corrections, Data Controller can be described once and used everywhere.
<h4>List of Divisions Involved in Preparing Tables (principle 3.2)</h4>
By using the Data Controller in combination with knowledge of data lineage (eg from SAS metadata or a manual lookup table) it becomes possible to produce an automated report identifying exactly who - and hence which division - was involved in both the preparation and the validation of all source data per reporting table for each reporting cycle.
<h4>Processes should integrate and document key controls (principle 3.3)</h4>
Data Controller can be used as a staging point for verifying the quality of data, eg when data from one department must be passed to another department for processing. The user access policy will be as per the existing policy for your SAS environment.
<h2>Summary</h2>
Whilst the circular provides valuable clarity on the expectations of the NBB, there are significant costs involved to prepare for, and maintain, compliance with the guidance. This is especially the case where reporting processes are disparate, and make use of disconnected EUCs and manual processes. The Data Controller for SAS® addresses and automates a number of pain points as specifically described in the circular. It is a robust and easy-to-use tool, actively maintained and <a href="http://docs.datacontroller.io">documented</a>, and provides an integrated solution on a tried and trusted platform for data management.

---
title: Data Controller - a BICC perspective
description: Data Controller for SAS was recently implemented within DER Touristik, Germany. We caught up with Herbert Grossmann of the BICC to learn more about it.
date: '2020-08-10 20:43:45'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './Group-1dt.png'
tags:
- Data Quality
- DER Touristik
- Enterprise Guide
- SAS
- Use Cases
---
We caught up with <a href="https://www.linkedin.com/in/herbert-gro%C3%9Fmann-53690517a/">Herbert Grossmann</a> of DER Touristik to understand how Data Controller for SAS is used within the BICC and the types of challenges it solves. <a href="https://www.linkedin.com/in/herbert-gro%C3%9Fmann-53690517a/"><img class=" wp-image-1137 alignright" src="/wp-content/uploads/2020/08/0-1.jpeg" alt="" width="183" height="183" /></a> The previous article in this series can be found <a href="/data-controller-developer-perspective/">here</a>.
<h2>Guten Tag, Herby! Can you tell us about your role within DER Touristik?</h2>
Yes, I am working here as project manager for BI and Analytics and my department is the BICC (Business Intelligence Competence Centre), and we have an absolute focus on the SAS technology stack - so that's my daily business.
<h2>Great. And, I understand you guys are using Data Controller for SAS. What do you use it for?</h2>
Well, mainly for managing control tables, which we have a lot of nowadays in the data warehouse. But we also implemented what we call an "early bird booking system". There we have used the Approval process within Data Controller, which is excellent, because users, business departments etc, can approve data that would normally only be accessible within the back-end warehouse itself. So now they have an interface, which limits their access to specific views, and this is very useful - it was also highly commended by our management.
<h2>So, business users can approve modifications to secure warehouse tables without having direct write-access themselves?</h2>
Exactly.
<h2>Fantastic. Next question. How does having Data Controller make your life easier?</h2>
Well - there is the version control of course, which gives us much better traceability of changes, to see what was changed by whom, at what time.
And we have the immediate constraint checking, which is also very useful because some of the tables are sensitive towards, let's say, changes of the primary key. And in the past when we did it the "old fashioned way" it was possible that, by mistake, someone could cause duplicate primary keys or stuff like that, so this is now not possible anymore, which is very good. And like the example that I mentioned before, now we can grant access to certain sensitive tables even for business users that would normally have no access, but we can decide whether to give them at least the right to <em>view</em> these tables, or during special events <em>edit</em> tables, or approve edits of those tables. So this gives a lot of opportunities, and makes it much easier than it was in the past.
<h2>Nice! And so, talking about the past, before you had Data Controller, how did you manage modifications to data in SAS?</h2>
We classically used two approaches - on one hand using SAS Enterprise Guide to directly edit tables or do imports, such as imports of Excel sheets for example. On the other hand, we have some batch processes that also do imports of Excel tables or CSV tables. So those were the classic and standard ways. And of course, especially the batch one, we are still using for some files, depending on the situation. But we do no editing of tables directly with Enterprise Guide anymore, because it is much safer and easier to use the Data Controller.
<h2>Understood. So on the Data Controller side, what would you say were your favourite features and why?</h2>
I would say that I like the editor as a whole very much. I think it is great that, the moment you make a table editable, you can define the ways in which you would edit the tables - like whether there is some historic logging or not, and the fact you can set the constraints.
And in the editor you then have a lot of Data Quality opportunities, such as defining drop-down lists for certain attributes, which really makes editing the tables easier and much more comfortable. It was a little bit of a pain in the past but now it's almost fun.
<h2>That's great feedback! Is there anything else, any comments you would like to add?</h2>
Yes, I like the fact that Data Controller is really just a part of the SAS environment. It's not a completely separate application that you have to install somewhere, but a kind of pluggable part of the SAS environment. I liked that very much, because then you still have everything in your hands. I mean, I am not a developer, but my knowledge of SAS is already enough to be able to handle the Data Controller as a whole, to even do the updates and/or to modify things. And also it's easy to show others who have experience with SAS how the tool works and what is to be done when there are data issues. And yeah, I think that's a big advantage. <img class="wp-image-1140 aligncenter" src="/wp-content/uploads/2020/08/Group-1dt-1-e1597092362693.png" alt="SAS DER Touristik" width="242" height="213" />

---
title: EUC Management Systems need these 12 Attributes
description: An EUC management system should automatically identify, clean, secure, backup, and integrate EUC data with full auditability, ownership, and approval.
date: '2018-10-30 09:13:25'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './DC-UML-Activity-Diagram-2.png'
tags:
- Data Management
- End User Computing
- EUC
- Excel
- SAS
- Shadow IT
- VBA
---
End User Computing (EUC) applications are unavoidable - the challenge is not to erase them, but to embrace automated approaches to EUC management that will identify, clean, secure, backup, and integrate EUC data with full auditability, ownership, and approval.
<h2>The Much-Maligned EUC</h2>
EUC applications such as Excel, Access Databases, and locally executed programs are often targeted as the source of a myriad of risks - such as financial misstatements, internal fraud, incorrect models, and potential for business process disruption. The rationale is that business developed / owned applications are not subject to the same access controls, development &amp; testing standards, documentation and release management processes as can be found over the "IT Fence". Whilst this is probably true, the inherent flexibility of EUCs - which can be quickly updated without service desk requests, project codes, or lost arms &amp; legs - means that they are, regardless, here to stay.
The challenge is to find a way to shine a light onto this "Shadow IT", and provide a framework by which EUC data can be extracted in a simple, safe, secure, scalable, and auditable fashion. <a href="/wp-content/uploads/2018/10/DC-UML-Use-Case-Diagram-EUC.png"><img class="aligncenter size-large wp-image-1008" src="/wp-content/uploads/2018/10/DC-UML-Use-Case-Diagram-EUC.png" alt="EUC Use Case Diagram" /></a>
<h2>EUCs can be Controlled</h2>
The 'war on EUCs' cannot be won - it simply isn't practical to ban them, or to migrate / redevelop every closely held and highly complex legacy VBA application. Until alternative solutions for Citizen Developers to build Enterprise Apps (such as <a href="https://sasjs.io">SASjs</a>) become mainstream, simple measures / controls on the EUCs themselves must be implemented - such as version control, read-only attributes, embedded documentation, peer review etc. In the meantime, a management system for EUCs is the ideal place for capturing the requisite metadata needed to monitor, audit, and secure the data therein. Such a management system should have, as a minimum, the following attributes:
<h3>EUC Data Quality at Source</h3>
The ability to run data quality routines at the point of data upload (from EUC to secure IT environment) provides instant feedback to EUC operators that will allow them to make corrections and avoid costly post-upload investigations, re-runs, or worse - incorrect results. As part of this process, it should be easy to create and update those Data Quality rules. A longer discussion of Data Quality can be found <a href="https://www.linkedin.com/pulse/zen-art-data-quality-allan-bowe/">here</a>.
<h3>EUC Data Review (4 eyes)</h3>
After EUC data is submitted, it should be reviewed before the target database is updated. It should be possible (but not mandatory) for this check to be performed by a different individual. When performing that check, it should only be necessary to review new / changed / deleted records. For changed records, the reviewer should also be able to see the original values. If the data is approved, the target table is updated. If rejected, the staged data can simply be archived.
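The review mechanics described above amount to a keyed diff between the staged upload and the target table. A minimal sketch of that idea (our own illustration, not Data Controller's actual implementation - all names are hypothetical):

```python
# Illustrative sketch: classify staged records as new, changed, or deleted
# relative to the target table, so a reviewer only sees what differs.

def classify_changes(target, staged, key):
    """target/staged: lists of dicts; key: name of the primary key column."""
    target_by_key = {row[key]: row for row in target}
    staged_by_key = {row[key]: row for row in staged}

    new = [row for k, row in staged_by_key.items() if k not in target_by_key]
    deleted = [row for k, row in target_by_key.items() if k not in staged_by_key]
    # for changed records, keep the original values so the reviewer can compare
    changed = [
        {"key": k, "original": target_by_key[k], "staged": row}
        for k, row in staged_by_key.items()
        if k in target_by_key and target_by_key[k] != row
    ]
    return new, changed, deleted

target = [{"id": 1, "rate": 0.5}, {"id": 2, "rate": 0.7}]
staged = [{"id": 1, "rate": 0.6}, {"id": 3, "rate": 0.9}]
new, changed, deleted = classify_changes(target, staged, "id")
```

On approval, only the `new`, `changed`, and `deleted` sets need to be applied to the target; on rejection, the staged data can simply be archived.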
<h3>Roles &amp; Responsibilities (RACI)</h3>
By capturing who is actually submitting the data, we can see who is responsible for each EUC. By reviewing who is signing off on that data, we have an indication of who is accountable. And by seeing who is being notified of changes to that data, we can deduce who is being consulted / informed. It will then be unnecessary to conduct time-consuming interviews or audits to produce instantly out-of-date and error-prone EUC ownership documentation!
<h3>EUC Data Security</h3>
EUCs are often present on network shares, with opaque access policies and few (if any) controls to prevent unintentional deletion or corruption of data. An EUC management system should ensure data protection from the point of EUC integration right through to the loading of the data to the target table(s). End users should not require write access to the target databases! Neither should individuals in IT be regularly relied upon to run manual scripts for loading business critical data. Finally, it should be possible to restrict (at both column and row level) which groups are given permission to edit or approve data.
<h3>Ease of Use</h3>
Adding new tables / EUCs to the system should be a BAU (configuration) task, and possible without needing to secure IT development resource. The process should be so well defined, that new EUC operators can safely integrate their processes with minimum (if any) engagement from IT.
<h3>EUC Traceability</h3>
Understanding the flow of data into regulatory reports is essential for ensuring the accuracy of the figures they contain. Whilst this can be done automatically in some IT systems (eg SAS Metadata or Prophet Diagram View), the lineage breaks down when data flow crosses system borders. An EUC management system should therefore keep a full history, enabling data items to be traced right back to a copy of the EUC from which the data arrived.
<h3>EUC Data Integration</h3>
Any "system" worth its salt will enable easy integration and flexible workflows to ensure that subsequent processes can be triggered on relevant events (such as EUC submission, or data approval). There should be no manual steps other than the act of submitting the data, and reviewing / approving the data.
<h3>Version control / automated testing</h3>
This should really go without saying, however the reality is that there are still many teams (yes, even in IT) who work without source control. Don't even think about building a complex data management system without solid source control and a comprehensive test harness. Not to mention automated build and deployment. When it comes to a system that is responsible for maintenance of business data, it is imperative that it is robust, performant, and filled with checks and controls.
<h3>Documentation</h3>
Whilst a decent system should be intuitive enough to operate without a manual, when it comes to maintaining, extending, or using advanced features - documentation is essential, and should be updated regularly. New feature? Write the test, make the fix, build &amp; deploy, pass the test, update the documentation, release. Documentation should be useful for users, developers, and administrators - with diagrams, screenshots, and process flows.
<h3>Scalability</h3>
During month end, temperatures are high and the pressure is on. The last thing you need on BD2 is system failure, especially when it's 4:30 on a Friday and 150 users are affected. Be sure your platform of choice is proven, supported, and highly available.
<h3>EUC Auditability</h3>
One of the biggest business benefits of an EUC Management System is the ability to trace data directly back to a locked down copy of the EUC that it came from. The system should therefore make it easy to identify and locate that copy, to see who submitted it, who signed it off, and what the precise changes were (adds, updates, deletes). <a href="/wp-content/uploads/2018/10/DC-UML-Deployment-Diagram-without-EUC-EUC-version.png">
<img class="aligncenter wp-image-1055 size-large" src="/wp-content/uploads/2018/10/DC-UML-Deployment-Diagram-without-EUC-EUC-version.png" alt="" /></a>
<h2>Data Controller for EUC Management</h2>
Before you go ahead and build / maintain your own black box bespoke EUC reporting solution, take a look at what the Data Controller has to offer (in addition to everything described above):
- Ability to run bespoke SAS programs before / after every edit or approve
- Easy / simple deployment (entirely within your existing SAS platform)
- Roadmap (version restore, data access reports, data profiling)
- A smooth and performant review and approve experience
- A proven methodology for EUC capture
- Extensive [documentation](https://docs.datacontroller.io)
- Free Community Edition
- [Formula Support](https://docs.datacontroller.io/excel)
- Secured by SAS
We can also provide an on-site consultant to perform the deployment and user training. [Get in touch](/contact) to learn more!
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/QhShWNnNjIw" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

---
title: ROI and Payback
description: How much time & money could you save by implementing Data Controller? We help you calculate the ROI and Payback time of your software investment.
date: '2021-07-15 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe
previewImg: './roi.jpeg'
tags:
- ROI
- Payback
- Regulatory
---
# ROI and Payback of Data Controller
For some customers, the evaluation of a tool is less about the additional value it provides, and more about - will this save me time (and therefore, €€€)?
By quantifying these savings, it is possible to calculate the rate of return of your investment in Data Controller (ROI) and the Payback period - how long it will take before that investment is effectively recuperated.
To assist with this process, we will explain all the areas where Data Controller can save time - and finish with a calculator you can use to build your business case internally.
## #1 - Development Time
This represents the time spent designing and preparing the SAS code (or DI Job) that will take a business input and load it into an existing table - be that a database, a dataset, or in-memory CAS. It could be loaded with a `proc import`, or an `extract` transform, or even a full-blown bespoke web application for capturing the particular input requirement.
It also includes the time spent unit testing that code, documenting any macros, and parameterising it accordingly for the particular input (eg, the network share in which the input file will land). This time could be spent by multiple stakeholders. To summarise:
* Preparing requirements
* Solution design
* Allocating the development work
* Preparing test data
* Preparing the environment
* Doing the actual development
* Documenting the result
* Code Review
* Writing test Cases
* Running the tests
With Data Controller, this time is reduced to **zero**. By taking metadata from the target table (columns, lengths, types, attributes such as NOT NULL) a grid is displayed dynamically into which end users can safely make changes, or drop files such as Excel or CSV with **zero code** (the data is extracted dynamically by JavaScript).
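The metadata-driven approach above can be illustrated with a toy validator: rules are derived from the target table's schema rather than hand-coded per upload. This is our own hypothetical sketch (the schema, column names, and rule keys are invented for illustration):

```python
# Hypothetical sketch of metadata-driven validation: rules come from
# an assumed description of the target table, not bespoke upload code.

SCHEMA = {  # invented metadata for an example target table
    "TX_ID":  {"type": int,   "not_null": True},
    "REGION": {"type": str,   "not_null": True, "max_length": 3},
    "AMOUNT": {"type": float, "not_null": False},
}

def validate_row(row, schema):
    """Return a list of human-readable errors for one staged row."""
    errors = []
    for col, rules in schema.items():
        value = row.get(col)
        if value is None:
            if rules.get("not_null"):
                errors.append(f"{col}: missing value")
            continue
        if not isinstance(value, rules["type"]):
            errors.append(f"{col}: expected {rules['type'].__name__}")
        elif rules.get("max_length") and len(value) > rules["max_length"]:
            errors.append(f"{col}: longer than {rules['max_length']}")
    return errors

assert validate_row({"TX_ID": 1, "REGION": "BE", "AMOUNT": 9.5}, SCHEMA) == []
```

Because the rules follow the schema, adding a new editable table means describing its metadata once - no per-table validation code.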
## #2 - Deployment Time
This represents the time taken to move jobs and programs from dev, through other SAS environments such as test, acceptance, and production. As part of this, it's often necessary to produce release documentation, perform additional deployment steps (such as setup of landing areas, permissions), prepare backout scripts, and perhaps even attend a Change Management meeting to explain the upcoming updates.
With Data Controller, once installed, this part is reduced to zero. A table only needs to be configured as editable in a test / accept environment if there is a specific need - and even then, it is a config change via the interface, not an actual code change.
## #3 Batch Incidents
Quite frequently, when capturing CSVs and Excel files from business users, there can be unintentional changes to the file format or data therein.
This can play havoc with the batch jobs used to process them, which typically expect a fixed structure, naming convention, directory path, and file type.
Failures in batch runs take time to troubleshoot and resolve, with knock-on impacts to downstream reporting teams.
Data Controller sidesteps the problem by ensuring that data is validated on arrival - ie, the user is unable to upload invalid data. At the same time, the process is flexible enough to ingest data with varying formats, so long as all the necessary columns are provided.
Batch incidents based on invalid files are therefore avoided.
## #4 Data Quality Issues
For various reasons, data captured regularly from business users, can one day fail to meet quality standards. This typically creates a whole bunch of work:
* Incident reporting the quality issue
* Creating a new rule for the quality issue
* Testing and deploying the new rule
* Reloading the original data
Data Controller drastically reduces the time spent on Data Quality with the following features:
* Automatic rules based on the target table schema
* Configurable frontend validations
* Simple and Complex dropdown rules
* Ability to run backend SAS programs for advanced DQ
In addition, corrections can be made immediately, 'in place', with an approval step and audit trail.
## #5 Compliance Costs
For many regulated clients, the costs of compliance (such as [Sarbanes Oxley](/sarbanes-oxley), BCBS, national [Data Quality regulations](/data-quality-and-the-nbb_2017_27-circular)) fall into 3 camps:
* Ongoing (day to day) costs
* Regular (eg annual) audit costs
* Fines (or the risk thereof)
In terms of data, such costs might come down to storing multiple copies of Excel EUCs on network drives, and the resultant technical debt (extra time) incurred in managing these as the copies mount up during a complex month-end process.
For audits, especially when performed by external companies, the time spent can be significant. For end-user computing systems (where source code is not secured) such audits must be _reperformed_ every time, which can get very expensive.
Examples of fines that have been handed down in the past due to Data Quality or Data Access issues include:
* Morgan Stanley (2020), [$5 million](https://www.cappitech.com/blog/morgan-stanley-fined-5m-for-swap-data-reporting-errors-as-cftc-looks-to-improve-data-quality)
* Citibank (2020) [$400 million](https://occ.treas.gov/news-issuances/news-releases/2020/nr-occ-2020-132.html)
* DTCC (2021) [£350k](https://www.msn.com/en-gb/money/other/eu-securities-watchdog-slaps-dtcc-s-derivatives-unit-in-the-city-with-350k-fine-for-negligence/ar-AAM3u06)
The benefits of Data Controller in these areas are also threefold:
* Reduced ongoing cost of operation (spreadsheets backed up securely with each dataload)
* Reduced cost of recertification with clear, controlled on-ramps from EUC to SAS
* Reduced risk of fines through a well documented, IT controlled, Data Governance process
Unlike desktop based solutions (such as Enterprise Guide), Data Controller secures all code and business logic at the backend in a centralised location - which is far more secure, auditable, and maintainable than the use of local network drives.
## #6 Data Lineage
For SAS customers using Data Integration Studio, a wealth of data lineage is available that maps source systems to target tables and vice versa.
To surface that information, it is typically necessary to make a request to an ETL developer (with DI Studio), or to step through a large number of connectors in SAS Lineage.
Data Controller provides both FORWARD and REVERSE lineage diagrams, available directly to all SAS users, that can be exported in PNG, SVG, and CSV formats.
## #7 Dataset Locks
Where end-users are using desktop tools to connect to SAS (eg Base SAS or Enterprise Guide) this can result in table locks preventing updates by other SAS users.
By using the VIEW menu in Data Controller to examine tables, no locks are held, and hence no processes are disrupted. In addition, it is possible to share links to tables, even filtered views of those tables.
## #8 Additional Value
Data Controller ships with dozens of [features](https://docs.datacontroller.io/#product-features) that help with Data Quality, Data Governance, and Data Management - such as:
* Data Catalog
* Data Dictionary
* Data Alerts
* Data Quality routines
* Data Loading routines
* DDL Exports
* User Navigator
* Metadata Navigator
* Data Model Change Tracking
We provide a section in the calculator for you to quantify the benefits/savings from having such features.
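The arithmetic behind the calculator is straightforward. As a minimal sketch (the figures below are entirely hypothetical - substitute your own estimates from the categories above):

```python
# Illustrative ROI / payback arithmetic with made-up figures -
# plug in your own estimates of annual savings and annual cost.

def roi_and_payback(annual_savings, annual_cost):
    """ROI as a percentage over one year; payback period in months."""
    net_gain = annual_savings - annual_cost
    roi_pct = 100 * net_gain / annual_cost
    payback_months = 12 * annual_cost / annual_savings
    return roi_pct, payback_months

# e.g. EUR 60,000 estimated annual savings vs EUR 20,000 annual cost
roi, payback = roi_and_payback(60_000, 20_000)
print(f"ROI: {roi:.0f}% - payback in {payback:.0f} months")  # ROI: 200% - payback in 4 months
```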
## ROI Calculator
Download our calculator, and see how much you could save by deploying Data Controller!
[
<button>
Download
</button>](/files/DC_ROI_PAYBACK.xlsx)

---
title: SaasNow Partnership
description: The Data Controller for SAS® team is pleased to announce a new partnership! Customers can now obtain Data Controller for SAS® directly from SaasNow - a respected SAS Partner based in the Netherlands.
date: '2021-07-22 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe
previewImg: './logo.png'
tags:
- Partners
---
# SaasNow Partnership
The Data Controller for SAS® team is pleased to announce a new partnership! Customers can now obtain Data Controller for SAS® directly from SaasNow - a respected SAS Partner based in the Netherlands.
Purchasing Data Controller directly through a partner such as SaasNow provides several benefits:
### Rapid Installation
As a new SaasNow customer, your Data Controller installation may be as easy as clicking a button. If you have an existing partner relationship, you have a team ready with the knowledge & skills to deploy & configure Data Controller seamlessly into your SAS platform.
### Ease of Support
With a partner as your first line of support, you can leverage a single process for all SAS-related requests, including Data Controller. For complex, non-standard issues, partners have the full and immediate support of the Data Controller for SAS® team.
### Ease of Billing
No need to onboard a new vendor or deal with multiple invoices! You can manage Data Controller as part of a single partner agreement. There are also the following differences to the [standard pricing model](/pricing) when purchasing from SaasNow:
- Based on CPU core - so you have unlimited users
- Monthly - so you can flex your contract
## About Notilyze (the company behind SaasNow)
SaasNow is a Notilyze brand. Notilyze makes data analytics available for every organization, regardless of industry or size. The Notilyze value framework consists of three principal components:
### 1 - Notilyze Analytics as a Service
The Notilyze Analytics as a Service concept enables your organization to transform data into valuable insights and become data-driven without investing in hardware and software or employing analytics experts such as data scientists.
### 2 - Notilyze Solutions as a Service
Notilyze develops industry and application-specific solutions for:
- IoT device analytics
- Real Estate Analytics
- Logistics Optimization
- Credit check automation
- Predictive maintenance
- Debt Collections & Recoveries (CoRe42)
### 3 - SaasNow Cloud for SAS Viya
SaasNow enables you to leverage the power of SAS in the cloud without the constraints of infrastructure or skilled staff. With SaasNow, you can bring your SAS Viya licenses into the SaasNow cloud fast and scale cores, memory, or storage according to your needs.
Most SaasNow Viya cloud environments are provisioned and ready to use within 4 hours.
If you would like to provision Data Controller for SAS® as part of your engagement with SaasNow, contact [daniel@saasnow.com](mailto:daniel@saasnow.com).
_The SaasNow version of this announcement is available [here](https://www.saasnow.com/news/data-controller-for-sas-now-available-on-saasnow/)_

---
title: Sarbanes-Oxley and Data Controller for SAS®
description: Learn how Data Controller for SAS reduces the risks and compliance costs of Sarbanes-Oxley and associated PCAOB Accounting Standard 5.
date: '2020-08-12 01:00:21'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './Screenshot-from-2020-08-10-19-16-01.png'
tags:
- data controller
- Data Lineage
- pcaob
- Regulatory
- sarbanes-oxley
- SAS
- sox
---
The Sarbanes-Oxley Act of 2002 has applied to all publicly-traded companies doing business in the US since 2006. The penalties can be severe - if Uncle Sam considers a corporate officer to have deliberately submitted an inaccurate certification, the corporate fine is $5 million, with up to twenty years in prison for the individual(s). Accidental mis-certification (or non-submission) still carries a $1 million fine and up to 10 years in prison.
There are many aspects to full Sarbanes-Oxley (SOX) compliance, the [legislation](https://www.govinfo.gov/content/pkg/BILLS-107hr3763enr/pdf/BILLS-107hr3763enr.pdf) is over 60 pages long. As with other regulatory obligations, the goal is to regularly provide enough evidence to satisfy the auditor that the requirements have been met. As anyone running a compliance team knows, this is no small endeavour. The ability to automate the generation of such evidence, or make it available automatically to auditors, can result in significant cost savings. This article breaks down the areas where Data Controller can contribute to satisfying the requirements of the Sarbanes-Oxley Act.
## Sarbanes-Oxley Act Section 404 - MANAGEMENT ASSESSMENT OF INTERNAL CONTROLS.
Data Controller facilitates internal controls through a 4 eyes review &amp; approve mechanism for data changes. This, combined with data validation and an integrated workflow feature, provides a mechanism to easily track and report on the number of internal controls (quality rules, signoffs, rejections), as well as the frequency they are applied, who is applying them, which data items the controls relate to, and who is performing them. Such metrics can be compared and contrasted with pre-existing and current quality measures to help determine control effectiveness. Variations in the number of submit / approve cycles between reporting teams, also provide objective and repeatable measurements to support the assessment of the effectiveness of internal controls.
<div class="imgHolder"><a href="https://www.govinfo.gov/content/pkg/BILLS-107hr3763enr/pdf/BILLS-107hr3763enr.pdf"><img class="wp-image-1105 size-full aligncenter" title="Sec 404. (Sarbanes-Oxley)" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-07-17-57-01.png" alt="Sarbanes Oxley"/></a><caption>Sarbanes Oxley</caption></div>
Section 404 is widely considered the most onerous part of Sarbanes-Oxley, as the documentation and testing of all the controls requires significant time and effort. To address this, the <a href="https://pcaobus.org/">Public Company Accounting Oversight Board</a> (PCAOB - a US non-profit created by the Sarbanes-Oxley act itself) released <a href="https://pcaobus.org/Rulemaking/Docket%20021/2007-06-12_Release_No_2007-005A.pdf">additional guidance</a> to assist management and auditors in producing their reports. This is officially labeled "Auditing Standard No. 5 - <em>An Audit of Internal Control Over Financial Reporting That Is Integrated with An Audit of Financial Statements</em>". A few points highlighted by the guidance in this standard are pertinent to users of Data Controller.
<h2>PCAOB AS5 Sec24 - Controls Over Management Override</h2>
Management Overrides (the freedom to simply "replace" reporting figures based on, presumably, sound judgement) are entity level controls that can be easily captured (in a centralised manner) by Data Controller. This, in fact, is the "core functionality" of the tool. Data Stewards / Data Processors (Editors) make the change, then one or more Data Owners / Data Controllers (Approvers) sign it off before it is applied to the target table. A copy of the original Excel file (if used) and a record of who made the change, when, what the change was, and why (if a reason is provided) is recorded. <a href="https://docs.datacontroller.io/dcc-validations/">Data Validation</a> rules can also be defined to ensure that inputs fit the desired pattern(s). <a href="https://pcaobus.org/Rulemaking/Docket%20021/2007-06-12_Release_No_2007-005A.pdf"><img class="aligncenter wp-image-1122" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-10-10-41-12.png" alt="Sarbanes Oxley sas management overrides" width="887" height="409" /></a> For fun, we made a short video for this part:
`youtube: https://youtu.be/iY3KQZL4ok0`
&nbsp; <h2>PCAOB AS5 Sec27 - Identifying Entity-Level Controls</h2> <img class="aligncenter wp-image-1126" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-10-12-58-26.png" alt="Sarbanes Oxley SAS Section 27" width="792" height="198" /> In the area of documenting the inputs, transformations and outputs of data flows within an organisation, SAS particularly shines, especially in the version 9 world. The table and column level lineage generated by SAS Data Integration provides a highly detailed view of the data lineage. Below is an example of Table level lineage, which colour-codes each table according to its library and captures the detail of each SAS job along the way. Clicking on a job will open the job in the metadata viewer. Clicking a table will open the table in VIEW mode. The lineage is shown all the way from source to target(s), or target to source(s), and can be exported in PNG, SVG, or CSV format.
<div class="imgHolder"><img class="aligncenter" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-10-14-41-04.png" alt="SAS Table Level Lineage Sarbanes Oxley"/><caption>SAS Table Level Lineage</caption></div>
Below is an example of column level lineage. Like Table level lineage, this can be performed forwards or backwards and exported in multiple formats. Each arrow represents a SAS transform. Where business logic is applied, this is additionally extracted and shown in red.
<div class="imgHolder"><img class="aligncenter" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-10-18-42-50.png" alt="SAS Column Level Lineage Sarbanes Oxley"/><caption>SAS Column Level Lineage</caption></div>
&nbsp; The ability to define additional data lineages outside of SAS (eg between spreadsheets or other reporting systems) is in the product roadmap, along with lineage from SAS Viya. <h2>PCAOB AS5 App B - Benchmarking of Automated Controls</h2> The use of IT-secured financial controls can significantly reduce the cost of Sarbanes-Oxley compliance testing following the first-year assessment, particularly where the source code is secured and cannot be modified by users. The core programs (services) within the Data Controller application that perform data signoffs are mature, distinct and change-tracked - so it is possible for Data Controller to be upgraded in-place without affecting the benchmarking strategy. This contrasts with spreadsheet-based control mechanisms, which must be revalidated in each reporting period.
<div class="imgHolder"><a href="https://pcaobus.org/Rulemaking/Docket%20021/2007-06-12_Release_No_2007-005A.pdf"><img class="aligncenter" title="PCAOB Release 2007-005A, Appendix B" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-08-22-15-50.png" alt="Sarbanes Oxley SAS"/></a><caption>PCAOB Release 2007-005A, Appendix B</caption></div>
## Sarbanes-Oxley Act Section 1102 - Tampering
Coming back to the original 2002 SOx paper, there is an additional stick being waved at those who destroy records. This is, unfortunately, a common occurrence in DWH landscapes - poorly designed data models often result in frequent rebuilds of monthly datamarts when issues are found. If your BI / ETL teams are routinely destroying / modifying database records as part of regular work efforts, you might wish to: a) ensure there is a well-documented ticketing system so that those individuals are protected from any accusations, or b) implement a [Bitemporal](/bitemporal-historisation-and-the-sas-dds/) data model to ensure a full and transparent version history of data is always kept, regardless of rebuilds. IT-secured tools such as Data Controller enable auditors to see easily for themselves who has changed a record, when, why, and who signed it off - thereby vastly reducing the potential for unintentionally impeding an investigation.
<div class="imgHolder"><a href="/wp-content/uploads/2020/08/BILLS-107hr3763enr.pdf"><img class="aligncenter size-full" title="SEC. 1102. (Sarbanes Oxley)" src="/wp-content/uploads/2020/08/Screenshot-from-2020-08-07-20-18-21.png" alt="sarbanes oxley SAS"/></a><caption>SEC. 1102. (Sarbanes Oxley)</caption></div>
## Sarbanes Oxley and SAS
We chose SAS as the platform on which to build Data Controller as it is very reliable, provides excellent database driver support (enabling our code to run inside almost any database), offers long-term customer support, and is very easy to deploy against. The demo version of Data Controller can be [deployed in under 30 seconds](https://docs.datacontroller.io/videos/#deploying-data-controller) (on a SAS 9 platform).
With SAS there are no additional servers to provision, firewalls to configure, or scaling issues to address - everything works "out of the box". SAS also integrates nicely with existing enterprise authentication mechanisms such as LDAP, and the platform is typically already fully secured under your existing IT policies at the backend.
Data Controller is built on [SASjs](https://sasjs.io) and hence we have versions for both SAS 9 and Viya. Do [get in touch](/contact/) to learn more.

---
title: SAS Excellence in Innovation Award Finalist
description: Data Controller for SAS® was one of four global finalists in the 2021 Excellence in Innovation partner award!
date: '2021-05-22 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './innovation.png'
tags:
- Innovation
- Partners
---
Data Controller for SAS® was a finalist in the 2021 "Excellence in Innovation" award!
Each year, SAS recognises a partner that has demonstrated outstanding innovation by adding its own intellectual property to SAS software in order to uniquely solve a complex customer business problem. [Analytium](https://sasapps.io) is one of the "chosen four", from among over 1800 partners globally!
## Award Process
There was a strict 15-minute virtual presentation that began with a 3-4 minute "customer elevator pitch" to explain the customer value of the solution. The remaining time delved into the technical side of things, plus Q&amp;A. There were around 12 SAS employees on the call, with a mix of backgrounds, and assessment was made across three dimensions:
- Does the solution represent unique innovation?
- Is the solution a valuable addition to SAS Software (ie, "fills a gap")?
- Does the solution solve a customer need?
For our part, we began the presentation with an animated video that explored some of the pain points solved by Data Controller, such as:
- Batch stability
- End User Computing
- Audit investigations
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/M8hafkS4zY4" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
On the innovation side, we talked about some of the following unique capabilities we have built to deliver customer value:
- Ability to deploy without a web server (streaming app capability)
- The DevOps framework, which we have open-sourced ([SASjs](https://sasjs.io))
- Cross SAS-Platform technology (Viya, SAS 9, and soon - desktop SAS also)
As well as an overview of the list of features we have built, to extend the value of SAS Software:
- SAS 9 [Data Lineage diagrams](https://docs.datacontroller.io/videos/#data-lineage) that can be exported as PNG, SVG or CSV
- [Zero Code data capture](/5-zero-code-ways-to-import-excel-into-sas/) with data model protection
- [Row Level Security](/row-level-security/) for any SAS-connected table
- Browser based SAS 9 Metadata explorer
- [Dynamic Cell Dropdowns](https://docs.datacontroller.io/dynamic-cell-dropdown/)
- Automated [EUC capture](/euc-management-system/)
- [Excel Formula capture](https://docs.datacontroller.io/excel/)
- [Bitemporal](/bitemporal-historisation-and-the-sas-dds/) Uploads
## Data Controller for SAS® and Microsoft
We are now putting the badges to good use in our upcoming LinkedIn campaign! [Analytium](https://sasapps.io) are both SAS and Microsoft partners, and Data Controller for SAS® is listed on the [Azure Marketplace](https://azuremarketplace.microsoft.com/en-us/marketplace/apps/analytiumltd1582389146376.datacontrollerforsas?tab=Overview).
These are the ads we will be running:
![](./capture_euc_sasaward_ms.png)
![](./lineage_sasaward_ms.png)
![](./model_sasaward_ms.png)
Data Controller Community Edition is free to use. If you'd like to give it a whirl, just [pop us a message](/contact)!

---
title: Siemens Healthineers Smart Data Catalog and Data Controller
description: Data Controller was implemented at Siemens Healthineers to facilitate their SAS-Powered Smart Data Catalog and enable Data Lineage reporting.
date: '2020-08-24 21:27:10'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './siemenshealthineers.png'
tags:
- Data Catalog
- Data Lineage
- Data Quality
- data warehouse
- Excel
- SAS
- Use Cases
- VBA
---
Data Controller was implemented at Siemens Healthineers to facilitate their SAS-Powered Smart Data Catalog and integrate with Data Lineage reporting. We are grateful to <a href="https://www.linkedin.com/in/helen-stark-5bb15b6a/">Helen Stark</a> (Power User) and <a href="https://www.linkedin.com/in/hans-juergen-kopperger-942634b7/">Hans-Juergen Kopperger</a> (SAS App Developer) for sharing their "before and after" experiences. The previous article in this series is available <a href="/der-touristik/">here</a>.
---
## Helen Stark[<img class="alignright wp-image-1171" src="/wp-content/uploads/2020/08/thumbnail_HStark.jpg" alt="" width="150" height="200" />](https://www.linkedin.com/in/helen-stark-5bb15b6a/)
### Hi Helen, it's great to meet with you today. Can you tell us - what is your role within the business?
I am the portfolio manager. What I do is a lot of demand management. So when people have new requests, like they want new data into our data lake, or they want it structured, or they want it visualized, then I manage all that. For the Americas region - North and South America. And I also do road shows, like a marketing kind of aspect, for the data scientists. And I do a smidge of project management here and there.
### Interesting. And what do you actually use Data Controller for?
I use it to keep our Smart Data Catalog up to date. One of the things that I do is we have marketing posters, and so I put the links to those marketing posters in there so they are a part of our front-end web design. And I also do the marketing videos, so adding those links. So really it's just adding and deleting entries via Data Controller, so that our Smart Data Catalog is updated at all times.
### And that Smart Data Catalog is in SAS. Is there a reason it's in SAS, and not, say - Excel? Is it used by other parts of the business?
Yes, it's used by the entire business.
### I see. And what is the Smart Data Catalog?
It's a listing of all of the offerings that data governance has. So, I say data governance, but it's data governance and analytical services. So, we are the data owners for all of the data in the company. It's the single source of truth for all Siemens health business data. And we use Data Controller to manage that. And Data Controller does some incredible things that I do not understand, such as being able to stage and preview the data before it's made publicly available. I mean, I honestly don't understand it, but it does some miraculous things.
### Nice feedback! Next question. How does Data Controller make your life easier?
Because it's so easy to use. Before, we used an Excel spreadsheet and it was quite unwieldy and bulky, and it was so easy to make mistakes, just trying to remember where you were. And with Data Controller I love that I can filter first and get to exactly where I want to be, and then I can edit. So, it really lessens the chance of me making a mistake. I love that - I am an editor, not an approver - so I make an edit, but then it goes to an approver, so it's like the four-eyes principle, something we did not have before. You can track changes - that's amazing. Yeah, it made everything easier, so whereas before I would dread updating the Smart Data Catalog, now you go in and it's done in like 3 to 5 minutes.
### Superb. Ok, before Data Controller came along, how did you get data into SAS?
So again, it was the foundation of this spreadsheet. You would have to check it out, so there was some control over it. If somebody checked it out then nobody else could go in and make any changes, and you would have to wait for it to be checked back in, and then run like a macro or some tool, and then you could upload it and update your smart data search. It was a process that depended on how careful your co-workers were about remembering to check it back in. In short, it used to often take days to update the SAS environment, and now it takes minutes.
### Wow - thats great to hear! Ok, Next up. What are your favourite Data Controller features and why?
The filter. It will always be my favourite; it will never change. That's really helpful. I just like how it makes it so much harder to make a mistake. With Data Controller it's much harder to make a mistake - it's less prone to human error. And the copy and paste is so easy, and yeah, there is nothing about it that I don't love.
### Brilliant, I think that's the best feedback we've ever had.
Really, I mean it has made our lives so much easier, so much faster. Um yeah, I just love it.
### Thank you so much!
---
## Hans-Juergen [<img class="alignright wp-image-1181" src="/wp-content/uploads/2020/08/0-1-1.jpeg" alt="Hans-Juergen" width="150" height="225" />](https://www.linkedin.com/in/hans-juergen-kopperger-942634b7/)
### Guten Morgen Hans-Juergen! Can you tell us a little about your role within the business?
Yes, I am working as a Data Integration Manager at Siemens Healthineers, providing BI and Analytical Services for our colleagues. My department is the DGA - Data Governance and Analytical Services. At Siemens Healthineers we are analyzing "Big Data" from our computer systems - AT (Angiography and Therapy), CT (Computed Tomography), MR (Magnetic Resonance Tomography), and LD (Laboratory Diagnostics).
As a consequence of our business strategy "from Onsite to Online", our focus is to connect more and more systems to our BI backend. With new services like "Condition-based Maintenance" or "Predictive Analysis", we can now generate data-driven services to increase business value for our customers or even decrease our overall service costs.
Our BI platform is based on the SAS technology stack. To get our Business Analysts and Data Scientists nearer to our Data Lake, I have created a Smart Data Catalog, which is an interactive web application with a "google style" search facility. Now they can do their jobs more effectively without struggling to find and access accurate, complete, and trustworthy data. As a result, they spend less time searching for data and can actually focus on using data to generate analyses and impactful insights.
### Now that's valuable! And what do you use Data Controller for?
To increase Data Quality while uploading backend data into our BI platform. Data Controller provides “data version control”, and full traceability of changes. We have several control tables that provide data for web applications like the "Smart Data Catalog", and with Data Controller it's now an easy, controllable and manageable process to get these changes into the backend tables.
In the past we had a custom Stored Process web app for uploading Excel files, based on an Excel template. This process had negative consequences for Data Quality, because it often happened that many different versions of the Excel templates were created, and in the end we didn't know which was the latest version or which values we wanted to upload into the system. It could take a lot of time to clarify who had done which upload.
I would often receive support tickets in relation to this upload, often caused by the diversity of our Excel templates and uncertainty over which was the right template... So, we would have a lot of discussions about how to bring data into the backend in a controlled manner.
Then one day, I got word through the <a href="https://sasusergroup.de/">SAS User Group Germany</a> that you provide a solution with Data Controller. I was initially interested in the <a href="https://docs.datacontroller.io/videos/#data-lineage">Data Lineage</a> functionality, but then I understood the main concept behind Data Controller. And for me the main benefit is that I can save a lot of time - with out-of-the-box features like the web data editor, and the web upload facility with Excel spreadsheet drag and drop. And there is the automatic workflow behind it, with the mandatory approval step. Since we implemented Data Controller, we no longer get those support tickets.
### Fantastic. If you had to pick your top features, what would they be?
The main benefit is getting data controlled, and into the backend. The controlled process, and the approval process, those are the main benefits. But we've got other benefits. For instance, we have Data Lineage now. The Data Lineage diagrams could also be linked directly from our Smart Data Catalog using URL parameters. This easy integration means our users can open the relevant page in Data Controller with one click.
The transparency of the history page is another benefit. I can look at every requested submit or approval - what changes have been applied, what changes have been submitted, and what changes have been approved. This helps us a lot to get data transparency.
The <a href="https://docs.datacontroller.io/emails/">email alerts</a> are a great feature. For the communication of changes, we had previously created a team collaboration chat, e.g. if someone made a change and needed to request an approval. But with email alerts, the notification of changes is now automatically sent to the responsible data owner, who can immediately click the email link and do his approval. This speeds up the whole process.
Another advantage is the "database approach" for updates. So, while someone is changing one row in a table connected to his use case, another colleague can change other rows of the same table nearly simultaneously, because not everyone is changing the same rows. Everyone has their own subset of rows - their own "workspace" - within one table. In the past we would have one Excel template, and this would always override all values. We would have a lot of Excel templates going around our colleagues, so there were always conflicts of overrides and versioning, and stuff like that. With Data Controller, it's now a simple, easy and transparent data capture process.
### Vielen Dank!
---
<div class="imgHolder">
<a href="/blog">
<img class="wp-image-1190 size-large aligncenter" src="/wp-content/uploads/2020/08/Get-Started-Smart-Data-Catalog.png" alt="Smart Data Catalog" />
<div>Smart Data Catalog</div>
</a>
</div>

---
title: "v4.0 Release: Formats & Special Missings"
description: This release provides support for viewing/editing Format Catalogs, plus the ability to work with special SAS missing numerics (.a, .b etc)
date: '2022-03-07 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './formats.png'
tags:
- Formats
- Releases
- Special Missings
---
We're excited to announce the new features in the v4 release:
* Ability to VIEW and EDIT values directly in Format Catalogs
* Ability to VIEW and EDIT special SAS missing numeric values (.a, .b etc)
* Audit table (single source for all data modifications in DC)
* Additional metadata on table viewer (Primary Key, labels, lengths, formats)
* Formatted / Unformatted switch on the DIFF screen
As well as the following fixes:
* Identical filter clauses on different tables being mis-assigned (this is the breaking change)
* Filter clauses applied to Excel / CSV / Datalines downloads
* Data values may now contain _leading_ blanks
* Customers may now have a single licence for multiple site ids
* Support for target table record deletion in Excel Uploads
We've also completely refactored the underlying engine for [Dynamic Cell Dropdowns](https://docs.datacontroller.io/dynamic-cell-dropdown) to make it faster, more robust, and easier to test.
We're also pleased to report that all the backend SAS Services now run with [strict mode](https://core.sasjs.io/mp__init_8sas.html) enabled, ensuring that SAS will 'fail fast' in case of data issues.
Some more background on our major features:
## View & Edit Format Catalogs
Data Controller allows you to view and edit formats _directly_ in the catalog. It does not require the maintenance of a secondary "cntlin" table! This ensures that the formats you are viewing / editing are indeed the latest ones.
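For context, the traditional cntlin workflow (which Data Controller removes the need for) looks something like the sketch below - the format name and values are purely illustrative:

```sas
/* Build a CNTLIN dataset - the secondary table that must be
   kept in sync with the catalog by hand */
data work.fmts;
  length fmtname $32 type $1 start label $8;
  fmtname='YESNO'; type='C';        /* hypothetical character format */
  start='Y'; label='Yes'; output;
  start='N'; label='No';  output;
run;

/* Rebuild the catalog - forget this step and the catalog is stale */
proc format library=work cntlin=work.fmts;
run;
```

Editing the catalog directly removes the risk of the cntlin dataset and the catalog drifting apart.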
All of the usual Data Controller features are available for formats too, including:
* Configure HOOK scripts to run before / after a change or approval
* Locking mechanism to avoid issues with parallel updates
* Make changes in the web, or via Excel or CSV upload
* Edit individual format entries on dedicated screen
* Create complex filters (and save filtered views)
* Full-table (catalog) search for specific values
* Download in Excel, CSV or Datalines formats
* Export the DDL (in various flavours)
* Mark format entries for deletion
![formats](edit_format_record.png)
Information on configuration is available in the [documentation](https://docs.datacontroller.io/formats)
## View & Edit SAS Special Missing Numerics
Did you know that, in addition to a regular missing value in SAS (`.`), there are 27 other types of missing? They are represented by the letters a-z and an underscore (`._`).
These values can now be both viewed and edited in Data Controller following an update to the [SASjs Adapter](https://github.com/sasjs/adapter#variable-types).
`video: [Retain Formulas when Loading Excel to SAS](https://www.youtube-nocookie.com/embed/ggrcNr23Jzw)`
There is nothing extra to configure for special SAS numerics - they are simply available by default, for numeric cells.
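For anyone who hasn't worked with them before, a minimal sketch of how special missings behave in a Base SAS data step:

```sas
data _null_;
  x=.a;     /* special missing 'a' */
  y=._;     /* underscore - the lowest-sorting missing value */
  if missing(x) then put 'missing() is true for .a';
  if x=.a then put 'direct comparisons against .a work';
  if y < . and . < x then put 'sort order: ._ then . then .a-.z';
run;
```

All 28 missing values test true with `missing()`, yet remain distinguishable from each other, which is what makes them useful for encoding *why* a value is absent.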
## Audit History Table
Previously, transactional changes made to tables in Data Controller could only be tracked by means of individual CSV files. A user could (and still can) navigate to the HISTORY tab, find their change, and download a zip file containing all relevant information such as the original excel that was uploaded, SAS logs, the changed records (CSV) and the staging dataset.
The issue is that this information did not capture the original, unchanged records (necessary to support rollbacks), nor did it provide a central point for querying the entire change history of a particular variable / record over time.
These issues are now resolved with the introduction of the MPE_AUDIT table.
![audit](audit.png)
The macro used to load this table is open source and available [here](https://core.sasjs.io/mp__storediffs_8sas.html).
## Roadmap (7th March 2022)
We continue to update and improve Data Controller. Upcoming features include:
* PK highlighting for VIEW as well as EDIT tables
* Data Rollback
* Ability to configure individual audit tables (ie, 1 per EDIT table)
* Data Controller API
* Data Controller on Viya 4
* Data Controller on Desktop SAS (using [sasjs/server](https://github.com/sasjs/server))
<hr>
Did you know Data Controller Community Edition is free to use? [Contact us](/contact) for your copy!

@ -0,0 +1,132 @@
---
title: "v5.0 Release: Column Level Security"
description: Data Controller now supports Column Level Security (in both VIEW and EDIT mode) as well as a number of other fixes and improvements
date: '2022-07-11 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './cls_example.png'
tags:
- Data Catalog
- Data Lineage
- Releases
- Special Missings
---
# v5 Release
The big news items for Data Controller since the last release are:
* Support for Viya 4
* Support for Base SAS (on [SASjs Server](https://server.sasjs.io))
* Column Level Security (choose VIEWable or EDITable columns for specific groups)
Version 5 also brings some additional goodies!
* One click Data Catalog and Table Lineage refresh on install
* Support for Swedish (and other) Locales
* Deploy Demo DC _without_ Admin Rights
* Configurable Audit History Location
* Password Protected Excel Import
* DC_RESTRICT_EDITRECORD option
* Submit reason in HISTORY
* Shaded PK in VIEW
## New Features
### Column Level Security
Thanks to the new MPE_COLUMN_LEVEL_SECURITY table it is possible to configure:
* Which columns should be VISIBLE in VIEW mode
* Which columns should be EDITABLE in EDIT mode
* Which columns should be HIDDEN in EDIT mode
Rules apply to the specified [DC Groups](https://docs.datacontroller.io/dcc-groups/). When this mode is activated, it is **not** possible to:
* Add or Delete records
* Modify the Primary Key
* Upload files (eg CSV or Excel)
`video: [Column level Security with Data Controller for SAS](https://www.youtube-nocookie.com/embed/jAVt-omtjVc)`
More information is available in the [documentation](https://docs.datacontroller.io/column-level-security/).
### One click Data Catalog and Table Lineage on Install
Previously it was necessary to dig out the relevant services to refresh the Data Catalog (all versions) or the Table Lineage (SAS 9 EBI only).
We now present links during the deploy process to speed up the initialisation of these features.
### Support for Swedish (and other) Locales
In Viya and SASjs Server, the session encoding is always UTF-8. For SAS 9 EBI however, the session can vary depending on the Locale - from WLATIN1 through to WLATIN9 and beyond.
Thanks to some updates in our [JSON generator](https://core.sasjs.io/mp__jsonout_8sas_source.html) we can now support UTF-8 outputs even where the SAS session is not UTF-8.
### Deploy Demo DC _without_ Admin Rights
This has always been possible in Viya and SASjs Server (assuming you can write to the necessary folders) but for SAS 9 EBI it was previously necessary to have SAS Management Console or Data Integration Studio (or access to Batch Tools) in order to import the SPK.
Our new deployment process for SAS 9 EBI simply involves running a SAS Program, meaning that you can deploy to any folder in metadata (as a streaming app).
For a full deploy though, you would still need access to the Web Server, in order to deploy the frontend...
### Configurable Audit History Location
Since the [version 4 release](/v4-0-formats-special-missings/) we have been capturing ALL change history in a single audit table.
This has resulted in some voluminous output!
It is now possible (on a per-table basis) to [configure](https://docs.datacontroller.io/dcc-tables/#audit_libds) alternative audit tables, or to switch the feature off completely.
### Password Protected Excel Import
If your Excel is password protected, just provide the password during import to unlock and ingest it. More information [here](https://docs.datacontroller.io/videos/#uploading-a-password-protected-excel-file).
### DC_RESTRICT_EDITRECORD Option
We are informed that sometimes you would like the option to have FEWER options for inputting data! Who are we to argue? You can now disable the 'EDIT RECORD' dialog using [this option](https://docs.datacontroller.io/dcc-options/#dc_restrict_editrecord).
### Submit Reason in HISTORY
We have replaced the "groups" column in the SUBMITTED / APPROVE / HISTORY tabs with 'SUBMIT_REASON' - which is even more reason to describe your submits!
### Shaded PK in VIEW
Using [this macro](https://core.sasjs.io/mp__getpk_8sas.html) we now automagically colour the primary key fields in the VIEW menu.
The logic differs from the EDIT menu in that it looks for an actual UNIQUE + NOMISS index on the target table, as opposed to picking up the [BUSKEY](https://docs.datacontroller.io/dcc-tables/#buskey) from MPE_TABLES.
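As a sketch of the kind of index involved (the table and column names below are hypothetical):

```sas
data work.mytable;                  /* hypothetical target table */
  do id=1 to 3;
    version=1; value=id*10; output;
  end;
run;

/* A unique index that also disallows missing key values -
   effectively declaring (id, version) as the primary key */
proc datasets lib=work nolist;
  modify mytable;
  index create pkidx=(id version) / unique nomiss;
quit;
```

A composite index created this way guarantees both uniqueness and non-missingness of the key columns, which is what qualifies it as a primary key.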
## Bugfixes
We zapped a few of these, notable ones:
* Multiple Approver workflow not working
* Special Missings in cell dropdowns
* Special Missings in filters (BETWEEN + IN)
* Special Missings in CSV uploads
* Support for leading blanks in Excel uploads
* Improved Streaming App support under strict CSP policy
## Model Changes
We've made the following changes to the data model:
* Dropped MPE_APPROVALS table
* Added MPE_SUBMIT table
* Added MPE_COLUMN_SECURITY table
* Dropped HELPFUL_LINK from MPE_TABLES
* Added AUDIT_LIBDS to MPE_TABLES
## Roadmap (11th July 2022)
The following items are on our radar for 2022:
* Admin Screen
* Data Rollback (from UI)
* Data Controller API
If you'd like to see something extra or something else entirely, you can also engage us to build it for you! Our team specialises in [SAS App Development](https://sasapps.io).

@ -0,0 +1,81 @@
---
title: "v5.1 Release: Library & Dataset Info"
description: Spare yourself the effort of coding up a dictionary query or proc contents - you can now view dataset and library info directly in the Data Controller interface.
date: '2022-09-02 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './library-info.png'
tags:
- Releases
---
# v5.1 Release
The new features in this release are:
* Refresh the Data Catalog on a per-library basis
* View the Data Catalog info in a new, linkable, 'library info' screen
* View dataset information (proc contents)
* New Forward & Back option in the Edit Record modal
* Tidying / publishing of [source code](https://code.datacontroller.io)
Various fixes have also been delivered, including:
* Removed error dialog when a library is empty
* Loading of formats with different length settings
* Suppress ABORT modal when dynamic filtering
* Dynamic retrieval of Server Context when deploying on SAS 9
* Fixes to various edge cases in the FILTER dialog
* Display correct number of actual changed records (Previously capped at 200, the display amount)
* Dropdown values on modified Edge instance are no longer "sticking" (we built a custom modal)
* Fixed issue with `NaN` being displayed when copy/pasting in certain contexts
There is also a significant speed improvement when working with wide tables (hundreds of variables).
## New Features
### Data Catalog Library Refresh
Previously the only way to refresh the Data Catalog was to run the `services/admin/refreshcatalog` service. Now, any user can refresh it using the refresh button for a specific LIBREF in the VIEW menu.
![](refreshlib.png)
### Data Catalog Library Info
It is now possible to view the library info on a new (linkable) library page. Useful metrics such as the engine used, the physical size of the library (if BASE), permissions, table count etc are surfaced.
![](library-info.png)
### Dataset Info
Using the [ds_meta](https://core.sasjs.io/mp__dsmeta_8sas.html) macro a large number of dataset specific metrics are now available on both VIEW and EDIT pages.
This is a handy place to check for common issues such as a lack of dataset compression, or the presence of a significant number of logically deleted records.
![](dsinfo.png)
![](dsinfo2.png)
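Such checks can of course also be scripted; a sketch using the SAS dictionary tables (the library name and filter are illustrative):

```sas
/* Flag uncompressed datasets, and datasets carrying logically
   deleted (unreclaimed) observations, via dictionary.tables */
proc sql;
  select memname
       , compress              /* 'NO' = uncompressed */
       , nobs                  /* observation count */
       , delobs                /* deleted observations not yet reclaimed */
    from dictionary.tables
    where libname='WORK'
      and memtype='DATA'
      and (compress='NO' or delobs>0);
quit;
```

The Dataset Info screen surfaces the same kind of metrics without any coding.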
### Forwards & Back
You can now move forwards and backwards through observations in the Edit Record modal, FSEDIT style!
`video: [Edit Record Modal in Data Controller for SAS](https://www.youtube-nocookie.com/embed/phXtIXhI_3k)`
### Data Controller Source Code
To assist users, admins and developers with understanding exactly what is going on inside the Data Controller services, we have published the source code to [https://code.datacontroller.io](https://code.datacontroller.io). The programs are documented using Doxygen and [sasjs doc](https://cli.sasjs.io/doc).
## Bug Fixes
We deployed Data Controller to the SAS environment of a Government customer this year, and we're grateful to them for performing some very extensive and intensive testing! A fairly large number of issues were found and fixed over the last 3-4 months, so an upgrade is highly recommended for all customers. The deployment page is [here](https://4gl.uk/dcdeploy).
## Roadmap (3rd September 2022)
The following items are on our 'would like to build' list. If you're interested in any of these, and there's a commercial opportunity involved, then let's [chat](https://datacontroller.io/contact)!
* Admin Screen
* Data Rollback (from UI)
* Data Controller API


@@ -0,0 +1,34 @@
---
title: "v5.2 Release: Lineage Updates"
description: The lineage in SAS 9 EBI is a beauty to behold. We bring it to your browser, and it's now better than ever!
date: '2022-11-16 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './lineage.png'
tags:
- Releases
---
# v5.2 Release
Many of you are telling us that your favourite feature in Data Controller is the Data Lineage explorer!
And recently it's been getting battle tested in some large environments, with sizeable lineage trees. To avoid the need to wait for these trees to render, we've now added the ability to limit lineage depth _before_ generating the lineage.
![](collineage.png)
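Conceptually, capping the tree before rendering amounts to a depth-limited graph walk. A minimal sketch follows, with invented node/edge shapes rather than Data Controller's actual lineage model:

```javascript
// Illustrative sketch: depth-limited breadth-first walk over a lineage
// graph, so rendering can stop at a user-chosen maxDepth. The edge
// format ([from, to] pairs) is invented for the example.
function limitLineage(edges, root, maxDepth) {
  const children = new Map()
  for (const [from, to] of edges) {
    if (!children.has(from)) children.set(from, [])
    children.get(from).push(to)
  }
  const kept = []
  const queue = [[root, 0]]
  const seen = new Set([root])
  while (queue.length) {
    const [node, depth] = queue.shift()
    if (depth >= maxDepth) continue // stop expanding beyond the limit
    for (const next of children.get(node) || []) {
      kept.push([node, next])
      if (!seen.has(next)) {
        seen.add(next)
        queue.push([next, depth + 1])
      }
    }
  }
  return kept
}
```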
We've also spent time improving the responsiveness of Data Controller to suit different screen sizes. We've reduced the number of menu items from 5 to 3, and adjusted the way the menu works when Data Controller is accessed from devices such as mobile or tablet.
Furthermore, the following fixes have been deployed:
* Prevent hanging in column lineage when library id not found
* Fix issue with "max_depth" not limiting depth in some cases
* Fix issue with PK fields not shown in dictionary tables
* Escaping of ampersands in file / table lineage
* Fix alignment of the "library info" screen
Finally, since the last release, we also made a demo instance of Data Controller public. It makes use of the mocking capabilities of [SASjs Server](https://server.sasjs.io) (backend is JS-only).
You can try it out here: [https://demo.datacontroller.io](https://demo.datacontroller.io)


@@ -0,0 +1,30 @@
---
title: "v5.3 Release: ViewBoxes"
description: Compare records against multiple tables on the same screen, using ViewBoxes
date: '2023-02-24 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './viewbox.png'
tags:
- Releases
---
# v5.3 Release
Have you ever examined a record in VIEW or EDIT and needed to compare against data values in other tables (or even other places in the _same_ table)?
With viewboxes, you can now create filtered views (both at column and record level) of up to 6 other tables and arrange them on the same screen.
![](viewbox_edit.png)
The tables, filters, and grid position are all saved in the URL, so you can share your view with colleagues. Primary key fields (determined by column constraints) are shaded. Column ordering and visibility are configurable. If the table is editable, it can be opened in edit mode in a new window.
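As a rough illustration of the URL-sharing idea, viewbox state can round-trip through query parameters like so. The parameter names here are invented for the example, not Data Controller's real scheme:

```javascript
// Illustrative sketch only: serialise viewbox state (table, filter,
// grid slot) into URL query parameters so a view can be shared, and
// parse it back out. Parameter names are hypothetical.
function encodeViewboxes(boxes) {
  const params = new URLSearchParams()
  boxes.forEach((box, i) => {
    params.set(`vb${i}_table`, box.table)
    if (box.filter) params.set(`vb${i}_filter`, box.filter)
    params.set(`vb${i}_slot`, String(box.slot))
  })
  return params.toString()
}

function decodeViewboxes(query) {
  const params = new URLSearchParams(query)
  const boxes = []
  for (let i = 0; params.has(`vb${i}_table`); i++) {
    boxes.push({
      table: params.get(`vb${i}_table`),
      filter: params.get(`vb${i}_filter`) || undefined,
      slot: Number(params.get(`vb${i}_slot`))
    })
  }
  return boxes
}
```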
`video: [ViewBoxes in Data Controller for SAS](https://www.youtube-nocookie.com/embed/jPGkLRspODM)`
As part of this release we also upgraded the [HandsOnTable](https://handsontable.com/customers/datacontroller) library, which allows us to surface a new feature: "copy with header rows". This can be found in the right-click menu.


@@ -0,0 +1,111 @@
---
title: "v6.0 Release: Viya API Explorer"
description: Data Controller community tier now includes an API explorer! We've also overhauled the (in)format ingestion capability, and revamped our pricing (now with unlimited users across all tiers).
date: '2023-06-26 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './previewapi.png'
tags:
- Releases
---
# v6.0
With a (Viya) API explorer, an overhauled (in)format ingestion capability, and now with unlimited users - Data Controller v6 is a major release indeed!
## Viya API Explorer
Following on from the metadata explorer (SAS 9 EBI feature) we have been looking to provide a similar capability for Viya. And so, we built the API explorer!
This lets you easily trigger the (GET) APIs and explore the responses without having to break open Postman or another development toolkit. Here's an example of opening a Job and examining the SAS code:
<iframe title="Browsing Viya API in Data Controller" width="560" height="315" src="https://vid.4gl.io/videos/embed/e284f815-a6dc-4998-80bd-152d54cb81a9?title=0" frameborder="0" allowfullscreen="" sandbox="allow-same-origin allow-scripts allow-popups"></iframe>
Here we grab the raw JSON for pasting into VS Code:
<iframe title="Grabbing JSON from Viya APIs with Data Controller" width="560" height="315" src="https://vid.4gl.io/videos/embed/18914633-342b-48f1-9021-bb01a8b33198?title=0&amp;warningTitle=0" frameborder="0" allowfullscreen="" sandbox="allow-same-origin allow-scripts allow-popups"></iframe>
And here we toggle the start / limit parameters to bring back more values:
<iframe title="Adjusting the start and limit params in the Data Controller Viya API Explorer" width="560" height="315" src="https://vid.4gl.io/videos/embed/29cc7a32-75c5-4cd5-8938-b1a7d0e1575d?title=0" frameborder="0" allowfullscreen="" sandbox="allow-same-origin allow-scripts allow-popups"></iframe>
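Under the hood these are plain GET requests with paging parameters. Here is a sketch of how such a request URL might be assembled; the host is a placeholder, and while `/jobDefinitions/definitions` is one of the Viya collection endpoints, the defaults chosen are assumptions:

```javascript
// Sketch: build a GET URL for a Viya collection endpoint with the
// start/limit paging parameters shown in the videos above.
function buildApiUrl(host, path, { start = 0, limit = 10 } = {}) {
  const url = new URL(path, host)
  url.searchParams.set('start', String(start))
  url.searchParams.set('limit', String(limit))
  return url.toString()
}
```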
We would love YOUR feedback as to how we can extend this API explorer to make it an even more useful tool!
## Unlimited Users
If you've been following us for a while you've probably heard the '5 users free' tagline. Well - you will hear it no more, as we now offer **unlimited users for all tier levels**!
That's right, you can download Data Controller (Community Edition) and use it across your entire enterprise TODAY, without spending a penny.
If, however, you would like priority support and full access to all features, we ask that you engage us on a <a href="https://datacontroller.io/pricing">paid subscription plan</a>.
## (IN)FORMAT Capabilities
Previously we only supported ingestion of run-of-the-mill SAS formats. Following customer feedback, we have now expanded this capability to include:
* Informats
* Multilabel Formats
* NotSorted Formats
The addition of these format types broke the data model we were using previously for holding format data. We had incorrectly assumed that the CNTLOUT dataset could be keyed on TYPE, FMTNAME and START.
In fact, START can be null, and the format data can have complete duplicates (multilabel). Furthermore, the _order_ of records is important (notsorted). Therefore we have applied a new key (TYPE, FMTNAME, FMTROW) where FMTROW is the index of the record of the format in question.
This means that if you insert a row in a format, Data Controller will see this as a CHANGE to all the rows underneath (if they are not duplicates). This difference in behaviour, as well as the change in the model, is the "breaking change" in this release (hence the major version bump). It will likely only affect you, though, if you are using Excel or CSV to upload (in)format data.
This primary key (TYPE, FMTNAME, FMTROW) is now also indicated in VIEW mode.
![](./formats.png)
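The FMTROW derivation can be pictured as a running counter within each (TYPE, FMTNAME) group, preserving load order so NOTSORTED formats and duplicate MULTILABEL rows stay distinguishable. This is illustrative code, not the actual SASjs macro logic:

```javascript
// Sketch: assign FMTROW as the 1-based index of each record within its
// (TYPE, FMTNAME) group, in original load order.
function addFmtRow(records) {
  const counters = new Map()
  return records.map((rec) => {
    const key = `${rec.TYPE}|${rec.FMTNAME}`
    const next = (counters.get(key) || 0) + 1
    counters.set(key, next)
    return { ...rec, FMTROW: next }
  })
}
```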
## Admin Screen
We've added a new screen (under the username dropdown) to show system details as well as a handy set of shortcut buttons for refreshing the data catalog and downloading configuration files.
![](./admin.png)
This screen is also available for regular users (those not in the Data Controller admin group), just without the additional buttons.
## Load More Values
We've added the ability to 'load more' history on the history page, as well as the ability to [show more history by default](https://docs.datacontroller.io/dcc-options/#history_rows).
"More Values" can now also be requested from the selection dropdowns when creating data filters.
![](./loadmore.png)
## Fixes
Some of the issues we've zapped:
* Enable data-catalog refresh of a single library when invalid libraries are present
* Prevent error when attempting an UNLOCK of an already-unlocked table
* Show Viya avatar when web app is served from a different domain
* Bug with delete-only uploads not appearing in the audit table
* Show special missing values on VIEW screen
## Roadmap
Looking to the future, we are actively tidying up the codebase to publish it as 'source-available' (the source is already available to existing customers). We are also investigating the HandsOnTable "Formula" feature to see if we can implement it on the EDIT grid.
If you would like to see any new features in DC, or would like to kick the tyres and give it a whirl, do [get in touch](https://datacontroller.io/contact)!


@@ -0,0 +1 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 640 640" style="enable-background:new 0 0 640 640" xml:space="preserve" width="32" height="32"><path style="fill:#fff" d="m395.9 484.2-126.9-61c-12.5-6-17.9-21.2-11.8-33.8l61-126.9c6-12.5 21.2-17.9 33.8-11.8 17.2 8.3 27.1 13 27.1 13l-.1-109.2 16.7-.1.1 117.1s57.4 24.2 83.1 40.1c3.7 2.3 10.2 6.8 12.9 14.4 2.1 6.1 2 13.1-1 19.3l-61 126.9c-6.2 12.7-21.4 18.1-33.9 12z"/><path style="fill:#609926" d="M622.7 149.8c-4.1-4.1-9.6-4-9.6-4s-117.2 6.6-177.9 8c-13.3.3-26.5.6-39.6.7v117.2c-5.5-2.6-11.1-5.3-16.6-7.9 0-36.4-.1-109.2-.1-109.2-29 .4-89.2-2.2-89.2-2.2s-141.4-7.1-156.8-8.5c-9.8-.6-22.5-2.1-39 1.5-8.7 1.8-33.5 7.4-53.8 26.9C-4.9 212.4 6.6 276.2 8 285.8c1.7 11.7 6.9 44.2 31.7 72.5 45.8 56.1 144.4 54.8 144.4 54.8s12.1 28.9 30.6 55.5c25 33.1 50.7 58.9 75.7 62 63 0 188.9-.1 188.9-.1s12 .1 28.3-10.3c14-8.5 26.5-23.4 26.5-23.4S547 483 565 451.5c5.5-9.7 10.1-19.1 14.1-28 0 0 55.2-117.1 55.2-231.1-1.1-34.5-9.6-40.6-11.6-42.6zM125.6 353.9c-25.9-8.5-36.9-18.7-36.9-18.7S69.6 321.8 60 295.4c-16.5-44.2-1.4-71.2-1.4-71.2s8.4-22.5 38.5-30c13.8-3.7 31-3.1 31-3.1s7.1 59.4 15.7 94.2c7.2 29.2 24.8 77.7 24.8 77.7s-26.1-3.1-43-9.1zm300.3 107.6s-6.1 14.5-19.6 15.4c-5.8.4-10.3-1.2-10.3-1.2s-.3-.1-5.3-2.1l-112.9-55s-10.9-5.7-12.8-15.6c-2.2-8.1 2.7-18.1 2.7-18.1L322 273s4.8-9.7 12.2-13c.6-.3 2.3-1 4.5-1.5 8.1-2.1 18 2.8 18 2.8L467.4 315s12.6 5.7 15.3 16.2c1.9 7.4-.5 14-1.8 17.2-6.3 15.4-55 113.1-55 113.1z"/><path style="fill:#609926" d="M326.8 380.1c-8.2.1-15.4 5.8-17.3 13.8-1.9 8 2 16.3 9.1 20 7.7 4 17.5 1.8 22.7-5.4 5.1-7.1 4.3-16.9-1.8-23.1l24-49.1c1.5.1 3.7.2 6.2-.5 4.1-.9 7.1-3.6 7.1-3.6 4.2 1.8 8.6 3.8 13.2 6.1 4.8 2.4 9.3 4.9 13.4 7.3.9.5 1.8 1.1 2.8 1.9 1.6 1.3 3.4 3.1 4.7 5.5 1.9 5.5-1.9 14.9-1.9 14.9-2.3 7.6-18.4 40.6-18.4 40.6-8.1-.2-15.3 5-17.7 12.5-2.6 8.1 1.1 17.3 8.9 21.3 7.8 4 17.4 1.7 22.5-5.3 5-6.8 4.6-16.3-1.1-22.6 1.9-3.7 3.7-7.4 5.6-11.3 5-10.4 13.5-30.4 13.5-30.4.9-1.7 5.7-10.3 
2.7-21.3-2.5-11.4-12.6-16.7-12.6-16.7-12.2-7.9-29.2-15.2-29.2-15.2s0-4.1-1.1-7.1c-1.1-3.1-2.8-5.1-3.9-6.3 4.7-9.7 9.4-19.3 14.1-29-4.1-2-8.1-4-12.2-6.1-4.8 9.8-9.7 19.7-14.5 29.5-6.7-.1-12.9 3.5-16.1 9.4-3.4 6.3-2.7 14.1 1.9 19.8l-24.6 50.4z"/></svg>


@@ -0,0 +1,72 @@
---
title: "v6.1 Release: Source Available"
description: Data Controller source code is now freely available for anyone to build and evaluate. We also enabled full deletion of formats, and reduced the audit data volumes (whilst retaining full change history).
date: '2023-07-25 09:00:00'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './gitea_logo.png'
tags:
- Releases
---
# v6.1
This will be our last ever release (blog) post, sniff sniff. Actually, it's very good news, because we're moving to a fully automated and PUBLIC release system!
Here you can find the notes (and assets) from the 6.1 release: [https://git.datacontroller.io/dc/dc/releases/tag/v6.1.0](https://git.datacontroller.io/dc/dc/releases/tag/v6.1.0).
In addition we've added the ability to fully delete formats from catalogs, reduced the volume of audit data, and zapped a couple of smaller bugs too.
## Source Available
Transparency is very important to us, as a team, and also for you - as you are trusting our software inside one of your most strategic platforms. Whilst we have had a 'source available' policy for several years now, it has been on a private invite / request basis. With version 6.1 it is now possible for ANY customer of SAS to freely evaluate our software without having to trust our build, or even to talk to us - you can create a release yourself, directly from the source repository, available here: [https://git.datacontroller.io/dc/dc](https://git.datacontroller.io/dc/dc). All the steps can be viewed in the project [release.yaml](https://git.datacontroller.io/dc/dc/src/branch/main/.gitea/workflows/release.yaml).
The source is available on a [dual licence](https://git.datacontroller.io/dc/dc/src/branch/main/LICENCE.md) (the same as our OEM-licenced grid system).
![](./gitea_logo.png)
We will continue to publish the SAS code in doxygen form at [https://code.datacontroller.io](https://code.datacontroller.io). If you would like the ability to raise issues, or would like to submit a pull request, do get in touch via support@datacontroller.io and we will gladly create a user account for you.
## Format Deletion
Unlike data in regular tables, formats must be modified and reloaded to catalogs in their entirety. Our previous approach for deletions was to export the format, remove the offending rows, and reload the catalog.
This is problematic when every row is marked deleted, as there is nothing to reload. The [fix](https://github.com/sasjs/core/pull/342) was made in the underlying SASjs Core macro ([mp_loadformat.sas](https://core.sasjs.io/mp__loadformat_8sas.html)) - now, when **all** format records are removed, `proc format` is invoked with the `delete` statement against the relevant formats in the relevant catalog.
## Reduced Audit Data
Previously when loading an [audit table](https://docs.datacontroller.io/tables/mpe_audit/), we always included the entire row - including values that have not changed.
This resulted in some very large audit tables, especially for tables with hundreds of columns!
To limit data volumes, audit data is now _excluded_ when `MOVE_TYPE="M"` (modified record), `IS_PK=0` (not a primary key column) and `IS_DIFF=0` (no change to the value).
We will continue to post the full record where `MOVE_TYPE in ("A","D")` (added/deleted) so that the table state can be recovered from a backup of the table, or reverted back from a modified table.
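The exclusion rule above boils down to a simple predicate per audit column record. Field names mirror the prose; this is a sketch, not the deployed SAS logic:

```javascript
// Sketch of the audit-reduction rule: modified records ('M') keep only
// key columns and genuinely changed values; added/deleted records
// ('A'/'D') keep the full record so table state can be recovered.
function keepInAudit(col) {
  if (col.MOVE_TYPE === 'A' || col.MOVE_TYPE === 'D') return true // full record
  // MOVE_TYPE === 'M': drop unchanged non-key columns
  return col.IS_PK === 1 || col.IS_DIFF === 1
}
```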
## Fixes
Bugs we've blasted:
* Frontend rejection of Excel uploads with duplicates on the primary key
* Missing `mf_existds()` macro issue when refreshing Table Lineage (SAS 9 EBI) or regenerating the Table Catalog (all versions).


@@ -0,0 +1,62 @@
---
title: Version 3.11 - Release Notes (Redshift, Locale, Proc Transpose)
description: V3.11 of Data Controller includes Amazon Redshift support, a Locale switch, Data Lineage for Proc Transpose, and various UI & Performance enhancements.
date: '2021-02-14 13:58:57'
author: 'Allan Bowe'
authorLink: https://www.linkedin.com/in/allanbowe/
previewImg: './1_5i1_LPEiMqqEuAmYhcmcIw.png'
tags:
- Redshift
- Releases
- Data Lineage
---
Following a busy few months, a number of new deployments, and feedback from several customers, we are now ready to release version 3.11 of Data Controller for SAS©. The biggest news in terms of updates is the addition of the licence key.
## Features
- REDSHIFT support - we now provide native pass-through access to Amazon Redshift for all load types (Replace, Update, SCD2 and Bitemporal). So you can safely modify that 100 million row table! Or use our generic macro to perform incremental loads as part of your batch process.
- LOCALE switch - you can now explicitly set your locale (can be useful when importing CSV or Excel as the loader uses anydtdtme. informats)
- DATA LINEAGE for PROC TRANSPOSE - previously our SAS 9 column level lineage diagrams did not support this transform. Now it does! With thanks to [Hans-Juergen](/siemens-healthineers-smart-data-catalog/) for his contribution.
- Reason message in emails - previously we were not showing the SUBMIT or REJECT messages in the notification emails. Now, we do.
- Drag & Drop Excel or CSV directly over the table to upload (no need to click the 'upload' button any more)
- Separator on the number of observations - handy when your table has billions of rows. 23,456,233,677 is easier to read than 23456233677!
- Discard button - in case you uploaded the wrong Excel
- Refresh button - no need to navigate away and back (or refresh the entire app) to refresh the data
- Refactored and improved FILTER interface (now with datetime pickers)
- Streamlined Viya Deployment
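For the curious, the observation-count separator mentioned above is the kind of thing `toLocaleString` gives you for free in the frontend (shown here for an en-US locale; this is an illustration, not necessarily how the app formats it):

```javascript
// Format a row count with thousands separators, e.g. 23,456,233,677.
function formatObsCount(n) {
  return n.toLocaleString('en-US')
}
```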
## Fixes
- In the VIEWER, there was an issue where tables were not shown where the libref in metadata was in lowercase and library level permissions were applied. This is now fixed.
- Previously, the only supported 'PROCESSED' column name was 'PROCESSED_DTTM'. You can now designate any (numeric) column to be updated with the current timestamp when that record is modified.
- File downloads in Viya now work when deployed to contexts other than 'SAS Job Execution compute context'
- Column widths can now be modified when using the filtered view of the table
- ACCEPT / REJECT buttons are now disabled if the user is not in the APPROVE group
- The app no longer fails if a trailing slash is provided by the user in the DCPATH variable
- Fixed issue where arrow was not shown in the Handsontable dropdown for some selectbox types
- Updated User Navigator in Viya to show additional user specific info
## Redshift
The loading routine for Redshift makes use of configurable bulkload options via the MPE_CONFIG table. The MPE_DATALOADS table tracks every time a table is loaded, showing the number of records added, removed, and modified - along with details of who made the update, when, a reason code, and which table was modified. Full table copies are avoided by performing SQL pass through in SAS©.
## Other areas
We've also made significant strides in our DevOps thanks to the [SASjs](https://sasjs.io) framework, brought the product into the [Analytium](https://sasapps.io) fold (UK SAS Partner), and - in addition to a standard pricing - we have put together a reseller pack. Again, contact [Allan Bowe](https://www.linkedin.com/in/allanbowe/) for further details. As a reminder, Data Controller features include:
- Ability to upload any Excel or CSV file to any SAS© registered table
- SAS 9 Table & Column level lineage with SVG & PNG export
- Data Edit & Approval workflow with Audit history
- Export in CSV, Excel, SAS Datalines, DDL format
- User Navigator (Viya + SAS 9)
- SAS 9 Metadata Navigator
- Data Quality rules
- Data Dictionary
- Data Catalog
- Data Alerts
Thanks to SAS/ACCESS engines we can support all major databases, e.g. Netezza, Teradata, SPDE, Postgres, SQL Server, Redshift etc.
Further details on the [pricing](/pricing) page, else contact [Allan Bowe](https://www.linkedin.com/in/allanbowe/).

gatsby-browser.js Normal file

@@ -0,0 +1,10 @@
import "bootstrap/dist/css/bootstrap.min.css";
// Import all js dependencies.
import "jquery/dist/jquery.min.js";
import "bootstrap/dist/js/bootstrap.min.js";
import "@popperjs/core/dist/umd/popper.min.js";
// import 'bootstrap/js/dist/js'
// import 'bootstrap/js/dist/util'
// import 'bootstrap/js/dist/carousel'
// import 'bootstrap/js/dist/dropdown'

gatsby-config.js Normal file

@@ -0,0 +1,189 @@
module.exports = {
siteMetadata: {
title: `Data Controller | Flexible and Secure SAS® Data Modification`,
description: `Data Controller for SAS® is dedicated to helping users, admins and developers manage their data. A zero code approach with Data Lineage, Catalog, Dictionary, Validation, Workflow, Alerts and more.`,
siteUrl: 'https://datacontroller.io/',
author: {
name: `Allan Bowe`,
summary: ``
},
social: {
linkedin: `https://www.linkedin.com/showcase/data_controller/`
}
},
pathPrefix: '',
plugins: [
'gatsby-plugin-styled-components',
'gatsby-plugin-image',
'gatsby-plugin-react-helmet',
'gatsby-plugin-sitemap',
{
resolve: 'gatsby-plugin-manifest',
options: {
name: `Data Controller | Flexible and Secure SAS® Data Modification`,
short_name: `Data Controller`,
description: `Data Controller for SAS® is dedicated to helping users, admins and developers manage their data. A zero code approach with Data Lineage, Catalog, Dictionary, Validation, Workflow, Alerts and more.`,
homepage_url: 'https://datacontroller.io/',
start_url: '/',
background_color: '#fff',
theme_color: '#314351',
display: 'standalone',
icon: 'src/images/favicon.png',
icon_options: {
purpose: `maskable`
},
cache_busting_mode: 'none'
}
},
{
resolve: 'gatsby-plugin-matomo',
options: {
siteId: 3,
matomoUrl: 'https://analytics.4gl.io/',
siteUrl: 'https://datacontroller.io/',
},
},
{
resolve: `gatsby-source-filesystem`,
options: {
path: `./content/blog`,
name: `blog`
},
__key: 'blog'
},
{
resolve: `gatsby-source-filesystem`,
options: {
name: `markdown-pages`,
path: `./src/markdown-pages`
},
__key: 'markdown-pages'
},
{
resolve: 'gatsby-transformer-remark',
options: {
plugins: [
{
resolve: 'gatsby-remark-embed-video',
options: {
width: 750,
related: false //Optional: Will remove related videos from the end of an embedded YouTube video.
}
},
{
resolve: `gatsby-remark-images`,
options: {
maxWidth: 630
}
},
{
resolve: 'gatsby-remark-responsive-iframe',
options: {
wrapperStyle: 'margin-bottom: 1.0725rem'
}
}
]
}
},
'gatsby-plugin-sharp',
'gatsby-transformer-sharp',
{
resolve: 'gatsby-source-filesystem',
options: {
name: 'images',
path: './src/images/'
},
__key: 'images'
},
{
resolve: 'gatsby-source-filesystem',
options: {
name: 'pages',
path: './src/pages/'
},
__key: 'pages'
},
{
resolve: `gatsby-plugin-google-fonts`,
options: {
fonts: [`Montserrat\:300,400,500`],
display: 'swap'
}
},
{
resolve: 'gatsby-plugin-local-search',
options: {
// A unique name for the search index. This should be descriptive of
// what the index contains. This is required.
name: 'blog',
// Set the search engine to create the index. This is required.
// The following engines are supported: flexsearch, lunr
engine: 'flexsearch',
// Provide options to the engine. This is optional and only recommended
// for advanced users.
//
// Note: Only the flexsearch engine supports options.
engineOptions: 'speed',
// GraphQL query used to fetch all data for the search index. This is
// required.
query: `
{
remark: allMarkdownRemark (filter: {fileAbsolutePath: {regex: "/content/blog/"}}) {
posts: edges {
post: node {
id
html
fields {
slug
}
frontmatter {
title
date(formatString: "MMMM DD, YYYY")
author
authorLink
previewImg {
childImageSharp {
gatsbyImageData(layout: CONSTRAINED)
}
}
}
}
}
}
}
`,
// Field used as the reference value for each document.
// Default: 'id'.
ref: 'id',
// List of keys to index. The values of the keys are taken from the
// normalizer function below.
// Default: all fields
index: ['title', 'html'],
// List of keys to store and make available in your UI. The values of
// the keys are taken from the normalizer function below.
// Default: all fields
// store: ['id', 'path', 'title'],
// Function used to map the result from the GraphQL query. This should
// return an array of items to index in the form of flat objects
// containing properties to index. The objects must contain the `ref`
// field above (default: 'id'). This is required.
normalizer: ({ data }) =>
data.remark.posts.map((data) => ({
id: data.post.id,
slug: data.post.fields.slug,
title: data.post.frontmatter.title,
date: data.post.frontmatter.date,
previewImg: data.post.frontmatter.previewImg,
html: data.post.html
}))
}
}
]
}

gatsby-node.js Normal file

@@ -0,0 +1,255 @@
const path = require(`path`)
const _ = require('lodash')
const { createFilePath } = require(`gatsby-source-filesystem`)
const recentPosts = []
const archives = {}
const tagsFrequent = []
exports.createPages = async ({ graphql, actions, reporter }) => {
const { createPage } = actions
// Define a template for blog post
const blogPostTemplate = path.resolve(`./src/templates/blog-post.tsx`)
const blogListTemplate = path.resolve(`./src/templates/blog-list.tsx`)
const blogSearchTemplate = path.resolve(`./src/templates/blog-search.tsx`)
// Get all markdown blog posts sorted by date
const result = await graphql(
`
{
allMarkdownRemark(
sort: { fields: [frontmatter___date], order: DESC }
limit: 1000
filter: { fileAbsolutePath: { regex: "/content/blog/" } }
) {
nodes {
id
fields {
slug
}
frontmatter {
title
date(formatString: "YYYY")
}
}
}
tagsGroup: allMarkdownRemark(limit: 1000) {
group(field: frontmatter___tags) {
name: fieldValue
totalCount
}
}
}
`
)
if (result.errors) {
reporter.panicOnBuild(
`There was an error loading your blog posts`,
result.errors
)
return
}
const posts = result.data.allMarkdownRemark.nodes
recentPosts.push(
...posts.slice(0, 10).map((p) => ({
slug: p.fields.slug,
title: p.frontmatter.title
}))
)
const tags = result.data.tagsGroup.group
tagsFrequent.push(
...tags.sort((a, b) => b.totalCount - a.totalCount).slice(0, 10)
)
// side bar data for each page
posts.forEach((d) => {
if (archives[d.frontmatter.date] == null) archives[d.frontmatter.date] = 0
archives[d.frontmatter.date]++
})
// Create blog posts pages
// But only if there's at least one markdown file found at "content/blog" (defined in gatsby-config.js)
// `context` is available in the template as a prop and as a variable in GraphQL
if (posts.length > 0) {
posts.forEach((post, index) => {
const previousPostId = index === 0 ? null : posts[index - 1].id
const nextPostId = index === posts.length - 1 ? null : posts[index + 1].id
createPage({
path: post.fields.slug,
component: blogPostTemplate,
context: {
id: post.id,
archives,
recentPosts,
tags: tagsFrequent,
previousPostId,
nextPostId
}
})
})
}
// Create blog-list pages
const postsPerPage = 6
const numPages = Math.ceil(posts.length / postsPerPage)
Array.from({ length: numPages }).forEach((_, i) => {
createPage({
path: i === 0 ? `/blog` : `/blog/page/${i + 1}`,
component: blogListTemplate,
context: {
page: 'index',
archives,
recentPosts,
tags: tagsFrequent,
filter: { fileAbsolutePath: { regex: '/content/blog/' } },
limit: postsPerPage,
skip: i * postsPerPage,
numPages: numPages,
currentPage: i + 1
}
})
})
for (const year in archives) {
const count = archives[year]
const numPagesOfYear = Math.ceil(count / postsPerPage)
Array.from({ length: numPagesOfYear }).forEach((_, i) => {
createPage({
path: i === 0 ? `/${year}/` : `/${year}/page/${i + 1}`,
component: blogListTemplate,
context: {
page: 'year',
archives,
recentPosts,
tags: tagsFrequent,
filter: {
frontmatter: { date: { gte: year, lt: year + 1 } },
fileAbsolutePath: { regex: '/content/blog/' }
},
limit: postsPerPage,
skip: i * postsPerPage,
numPages: numPagesOfYear,
currentPage: i + 1,
year: year
}
})
})
}
tags.forEach((tag) => {
const count = tag.totalCount
const numPagesOfTag = Math.ceil(count / postsPerPage)
Array.from({ length: numPagesOfTag }).forEach((__, i) => {
const tagPath = `/category/${_.kebabCase(tag.name)}/`
createPage({
path: i === 0 ? tagPath : `${tagPath}page/${i + 1}`,
component: blogListTemplate,
context: {
page: 'category',
archives,
recentPosts,
tags: tagsFrequent,
filter: { frontmatter: { tags: { in: [tag.name] } } },
limit: postsPerPage,
skip: i * postsPerPage,
numPages: numPagesOfTag,
currentPage: i + 1,
tag: tag.name
}
})
})
})
createPage({
path: '/search/',
component: blogSearchTemplate,
context: {
page: 'search',
archives,
recentPosts,
tags: tagsFrequent
}
})
}
exports.onCreatePage = ({ page, actions }) => {
const { createPage, deletePage } = actions
if (page.path == '/404/') {
deletePage(page)
createPage({
...page,
context: {
...page.context,
archives,
recentPosts,
tags: tagsFrequent
}
})
}
}
exports.onCreateNode = ({ node, actions, getNode }) => {
const { createNodeField } = actions
if (node.internal.type === `MarkdownRemark`) {
const value = createFilePath({ node, getNode })
createNodeField({
name: `slug`,
node,
value
})
}
}
exports.createSchemaCustomization = ({ actions }) => {
const { createTypes } = actions
// Explicitly define the siteMetadata {} object
// This way those will always be defined even if removed from gatsby-config.js
// Also explicitly define the Markdown frontmatter
// This way the "MarkdownRemark" queries will return `null` even when no
// blog posts are stored inside "content/blog" instead of returning an error
createTypes(`
type SiteSiteMetadata {
author: Author
siteUrl: String
social: Social
}
type Author {
name: String
summary: String
}
type Social {
twitter: String
}
type MarkdownRemark implements Node {
frontmatter: Frontmatter
fields: Fields
}
type Frontmatter {
title: String
date: Date @dateformat
author: String
authorLink: String
previewImg: File @fileByRelativePath
tags: [String!]!
}
type Fields {
slug: String
}
`)
}

package-lock.json generated Normal file
File diff suppressed because it is too large

package.json Normal file

@@ -0,0 +1,54 @@
{
"name": "data-controller-site",
"version": "1.0.0",
"private": true,
"description": "Data Controller Site",
"author": "Saad Jutt",
"keywords": [
"gatsby"
],
"scripts": {
"develop": "gatsby develop",
"start": "gatsby develop",
"build": "gatsby build",
"serve": "gatsby serve",
"clean": "gatsby clean",
"lint": "npx prettier --check \"src/**/*.+(ts|tsx|js|jsx|json|css|scss)\"",
"lint:fix": "npx prettier --write \"src/**/*.+(ts|tsx|js|jsx|json|css|scss)\" --ignore-path .gitignore"
},
"dependencies": {
"@browniebroke/gatsby-image-gallery": "^6.2.0",
"@mdx-js/mdx": "^1.6.22",
"@mdx-js/react": "^1.6.22",
"babel-plugin-styled-components": "^1.12.0",
"gatsby": "^3.5.1",
"gatsby-plugin-google-analytics": "^3.3.0",
"gatsby-plugin-google-fonts": "^1.0.1",
"gatsby-plugin-image": "^1.3.1",
"gatsby-plugin-local-search": "^2.0.1",
"gatsby-plugin-manifest": "^3.3.0",
"gatsby-plugin-matomo": "0.13.0",
"gatsby-plugin-react-helmet": "^4.3.0",
"gatsby-plugin-sharp": "^3.3.1",
"gatsby-plugin-sitemap": "^3.3.0",
"gatsby-plugin-styled-components": "^4.3.0",
"gatsby-remark-embed-video": "^3.1.1",
"gatsby-remark-images": "^4.2.0",
"gatsby-remark-responsive-iframe": "^4.2.1",
"gatsby-source-filesystem": "^3.3.0",
"gatsby-transformer-remark": "^4.0.0",
"gatsby-transformer-sharp": "^3.3.0",
"react": "^17.0.1",
"react-dom": "^17.0.1",
"react-helmet": "^6.1.0",
"react-icons": "^4.2.0",
"react-share": "^4.4.0",
"react-use-flexsearch": "^0.1.1",
"styled-components": "^5.2.3"
},
"devDependencies": {
"@popperjs/core": "^2.9.2",
"bootstrap": "^5.0.0-beta3",
"jquery": "^3.6.0"
}
}


@@ -0,0 +1,75 @@
import React from 'react'
import { Section } from '../shared'
import { SolidButton } from '../shared/styledComponents'
import { StyledHeading, StyledDesc, InputStyled, StyledAnchor } from './style'
const Footer = () => (
<Section bottomArrow={false}>
<div className="row">
<div className="col-md-3 me-md-5">
<StyledHeading>Data Controller</StyledHeading>
<StyledDesc>
Data Controller is a product of 4GL Apps, a brand of Bowe IO Ltd,
which is a UK company with a focus on SAS Software,{' '}
<StyledAnchor href="https://sasapps.io">Apps</StyledAnchor>, and
Services.
</StyledDesc>
</div>
<div className="col-md-3">
<StyledHeading>Newsletter</StyledHeading>
<form
className="kwes-form"
method="POST"
action="https://kwes.io/api/foreign/forms/mxKuyK4lxZWnG2WNH3ga"
>
<div className="mb-3">
<InputStyled
type="email"
name="email"
className="form-control"
aria-describedby="emailHelp"
placeholder="Email Address*"
required
/>
</div>
<div className="mb-3">
<InputStyled
type="text"
name="name"
className="form-control"
placeholder="First Name*"
required
/>
</div>
<div className="mb-3">
<InputStyled
type="text"
name="lastName"
className="form-control"
placeholder="Last Name"
/>
</div>
<SolidButton>Subscribe</SolidButton>
</form>
</div>
<div className="col-md-3">
<StyledHeading>Other Resources</StyledHeading>
<StyledDesc>
Visit our educational and fun SAS® software quiz{' '}
<StyledAnchor href="https://sasensei.com">Sasensei</StyledAnchor> and
test your knowledge of SAS topics.
</StyledDesc>
</div>
</div>
</Section>
)
export default Footer
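The newsletter form above delegates validation and submission to kwes.io. A minimal standalone stand-in for the same required-field check (hypothetical, not the actual kwes logic): email and first name are required, matching the form markup, and the email must at least look like an address.

```javascript
// Hypothetical validator mirroring the form's required fields.
// Not the real kwes.io implementation - illustration only.
function validateNewsletterForm({ email, name }) {
  const errors = []
  if (!email || !/^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(email)) errors.push('email')
  if (!name || !name.trim()) errors.push('name')
  return { valid: errors.length === 0, errors }
}

console.log(validateNewsletterForm({ email: 'ada@example.com', name: 'Ada' }))
// → { valid: true, errors: [] }
console.log(validateNewsletterForm({ email: 'not-an-email', name: '' }))
// → { valid: false, errors: [ 'email', 'name' ] }
```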


@@ -0,0 +1,40 @@
import React from 'react'
import styled from 'styled-components'
export const StyledHeading = styled.h6`
margin-bottom: 0.8rem;
text-transform: uppercase;
color: #888;
`
export const StyledDesc = styled.p`
color: #aaaaaa;
font-size: 0.9rem;
`
export const InputStyled = styled.input`
background: transparent;
border: none;
outline: none;
font-size: 0.9rem;
&:focus {
color: inherit;
background: transparent;
border: none;
outline: none;
box-shadow: none;
}
`
const Anchor = styled.a`
color: #d4d4d4;
text-decoration: none;
&:hover {
color: white;
}
`
export const StyledAnchor = ({ children, href }) => (
<Anchor href={href} target="_blank" rel="noopener noreferrer">
{children}
</Anchor>
)


@@ -0,0 +1,37 @@
import React from 'react'
import { Link } from 'gatsby'
import { Hero, HeroHeading, HeroDesc } from './style'
import { BottomSectionArrow, OutlineButton } from '../shared/styledComponents'
import { Container } from '../shared'
import { pathPrefix } from '../../../gatsby-config.js'
type DataProps = {
location: Location
heading: string
desc: string
}
// These are plain component props, so the component is typed with DataProps
// directly rather than Gatsby's PageProps (which is meant for page components).
const HeroSection: React.FC<DataProps> = ({ location, heading, desc }) => (
<Hero bg={location.pathname === pathPrefix + '/'}>
<Container>
<HeroHeading>{heading}</HeroHeading>
<HeroDesc>{desc}</HeroDesc>
{location.pathname === pathPrefix + '/' && (
<Link to="/contact/">
<OutlineButton>Try Data Controller</OutlineButton>
</Link>
)}
</Container>
<BottomSectionArrow />
</Hero>
)
export default HeroSection


@@ -0,0 +1,23 @@
import styled from 'styled-components'
import background from '../../images/home_hero_bg.png'
export const Hero = styled.main`
position: relative;
padding: 50px 0;
color: white;
background-color: #314351;
background-repeat: no-repeat;
background-image: ${(props) => (props.bg ? `url(${background})` : 'none')};
background-attachment: scroll;
background-position: bottom right;
`
export const HeroHeading = styled.h1`
font-family: 'Montserrat', 'HelveticaNeue', 'Helvetica Neue', Helvetica, Arial,
sans-serif;
text-transform: uppercase;
`
export const HeroDesc = styled.p`
opacity: 0.8;
`

41
src/components/layout.tsx Normal file

@@ -0,0 +1,41 @@
import React, { useEffect } from 'react'
import Navibar from './navibar'
import HeroSection from './herosection'
import Footer from './footer'
type DataProps = {
children?: React.ReactNode
heroSection?: boolean
heading?: string
desc?: string
location: Location
}
const Layout: React.FC<DataProps> = ({
location,
children,
heroSection = true,
heading,
desc
}) => {
useEffect(() => {
// inject the kwes form script once; the data-name guard keeps this idempotent
if (document.querySelector('script[data-name="kwes-script"]')) return
const kwesScript = document.createElement('script')
kwesScript.setAttribute('rel', 'noopener')
kwesScript.setAttribute('src', 'https://kwes.io/v2/kwes-script.js')
kwesScript.setAttribute('data-name', 'kwes-script')
document.head.appendChild(kwesScript)
}, []) // run once on mount instead of after every render
return (
<>
<Navibar location={location} />
{heroSection && (
<HeroSection location={location} heading={heading} desc={desc} />
)}
{children}
<Footer />
</>
)
}
export default Layout
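The effect above guards against injecting the kwes script twice. The same idempotent pattern, sketched against a tiny fake document so it can run outside a browser (`makeFakeDocument` is purely illustrative and not a real DOM API):

```javascript
// Idempotent script injection: appendChild happens at most once per data-name,
// mirroring the querySelector guard in the layout's useEffect.
function ensureScript(doc, src, dataName) {
  if (doc.querySelector(`script[data-name="${dataName}"]`)) return false
  const el = doc.createElement('script')
  el.setAttribute('rel', 'noopener')
  el.setAttribute('src', src)
  el.setAttribute('data-name', dataName)
  doc.head.appendChild(el)
  return true
}

// Minimal stand-in for the parts of the DOM the effect touches (illustration only).
function makeFakeDocument() {
  const scripts = []
  return {
    head: { appendChild: (el) => scripts.push(el) },
    createElement: () => ({
      attrs: {},
      setAttribute(key, value) {
        this.attrs[key] = value
      }
    }),
    querySelector: (selector) => {
      const m = selector.match(/data-name="([^"]+)"/)
      return scripts.find((s) => s.attrs['data-name'] === m[1]) || null
    }
  }
}

const doc = makeFakeDocument()
console.log(ensureScript(doc, 'https://kwes.io/v2/kwes-script.js', 'kwes-script')) // true: injected
console.log(ensureScript(doc, 'https://kwes.io/v2/kwes-script.js', 'kwes-script')) // false: already present
```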


@@ -0,0 +1,105 @@
import React from 'react'
import { Link, PageProps } from 'gatsby'
import dcLogo from '../../images/dclogo.png'
import { Container } from '../shared'
import { logoStyles, CustomNavBar, ulStyles, Li, StyledLink } from './style'
import { pathPrefix } from '../../../gatsby-config.js'
const naviLinks = [
{
name: 'Home',
url: '/',
active: 'no'
},
{
name: 'About',
url: '/about/',
active: 'no'
},
{
name: 'Blog',
url: '/blog/',
active: 'no'
},
{
name: 'FAQ',
url: '/faq/',
active: 'no'
},
{
name: 'Documentation',
url: 'https://docs.datacontroller.io/',
active: 'no'
},
{
name: 'Pricing',
url: '/pricing/',
active: 'no'
},
{
name: 'Book Demo',
url: '/contact/',
active: 'no'
},
{
name: 'Source Code',
url: 'https://git.datacontroller.io/dc/dc',
active: 'no'
}
]
type DataProps = {
location: Location
}
const Navibar: React.FC<DataProps> = ({ location }) => {
naviLinks.forEach((link) => (link.active = 'no'))
const currentLink = naviLinks.find(
(link) => pathPrefix + link.url === location?.pathname
)
if (currentLink) currentLink.active = 'yes'
return (
<CustomNavBar className="navbar navbar-expand-lg">
<Container>
<Link to="/">
<img src={dcLogo} style={logoStyles} alt="Data Controller Logo" />
</Link>
<button
className="navbar-toggler collapsed"
type="button"
data-bs-toggle="collapse"
data-bs-target="#navbarSupportedContent"
aria-controls="navbarSupportedContent"
aria-expanded="false"
aria-label="Toggle navigation"
>
<div className="navbar-toggler-icon" id="nav-icon4">
<span></span>
<span></span>
<span></span>
</div>
</button>
<div className="collapse navbar-collapse" id="navbarSupportedContent">
<ul className="navbar-nav mb-2 mb-lg-0" style={ulStyles}>
{naviLinks.map((link, index) => (
<Li key={index} className="nav-item">
<StyledLink
to={link.url}
className="nav-link"
active={link.active}
>
{link.name}
</StyledLink>
</Li>
))}
</ul>
</div>
</Container>
</CustomNavBar>
)
}
export default Navibar
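The active-link logic above compares `pathPrefix + link.url` against `location.pathname`. Extracted as a pure function (a sketch only; the component mutates the shared `naviLinks` array instead of returning a value, and the `/dc` prefix below is a made-up example):

```javascript
// Pure version of the navbar's active-link check: returns the url of the
// link whose prefixed path matches the current pathname, or null.
function findActiveLink(links, pathname, pathPrefix = '') {
  const match = links.find((link) => pathPrefix + link.url === pathname)
  return match ? match.url : null
}

const links = [{ url: '/' }, { url: '/about/' }, { url: '/pricing/' }]
console.log(findActiveLink(links, '/about/')) // → /about/
console.log(findActiveLink(links, '/dc/pricing/', '/dc')) // → /pricing/
console.log(findActiveLink(links, '/missing/')) // → null
```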


@@ -0,0 +1,333 @@
import React from 'react'
import styled, { css } from 'styled-components'
import { Link } from 'gatsby'
// styles
export const logoStyles = {
height: '55px'
}
export const ulStyles = {
marginLeft: 'auto'
}
export const CustomNavBar = styled.nav`
padding: 0;
background-color: #314351;
color: white;
font-size: 0.85rem;
/* Icon 1 */
#nav-icon1,
#nav-icon2,
#nav-icon3,
#nav-icon4 {
width: 60px;
height: 45px;
position: relative;
margin: 10px auto;
-webkit-transform: rotate(0deg);
-moz-transform: rotate(0deg);
-o-transform: rotate(0deg);
transform: rotate(0deg);
-webkit-transition: 0.5s ease-in-out;
-moz-transition: 0.5s ease-in-out;
-o-transition: 0.5s ease-in-out;
transition: 0.5s ease-in-out;
cursor: pointer;
}
#nav-icon1 span,
#nav-icon3 span,
#nav-icon4 span {
display: block;
position: absolute;
height: 9px;
width: 100%;
background: #79a843;
border-radius: 9px;
opacity: 1;
left: 0;
-webkit-transform: rotate(0deg);
-moz-transform: rotate(0deg);
-o-transform: rotate(0deg);
transform: rotate(0deg);
-webkit-transition: 0.25s ease-in-out;
-moz-transition: 0.25s ease-in-out;
-o-transition: 0.25s ease-in-out;
transition: 0.25s ease-in-out;
}
#nav-icon1 span:nth-child(1) {
top: 0px;
}
#nav-icon1 span:nth-child(2) {
top: 18px;
}
#nav-icon1 span:nth-child(3) {
top: 36px;
}
.navbar-toggler:not(.collapsed) #nav-icon1 span:nth-child(1) {
top: 18px;
-webkit-transform: rotate(135deg);
-moz-transform: rotate(135deg);
-o-transform: rotate(135deg);
transform: rotate(135deg);
}
.navbar-toggler:not(.collapsed) #nav-icon1 span:nth-child(2) {
opacity: 0;
left: -60px;
}
.navbar-toggler:not(.collapsed) #nav-icon1 span:nth-child(3) {
top: 18px;
-webkit-transform: rotate(-135deg);
-moz-transform: rotate(-135deg);
-o-transform: rotate(-135deg);
transform: rotate(-135deg);
}
/* Icon 2 */
#nav-icon2 span {
display: block;
position: absolute;
height: 9px;
width: 50%;
background: #d3531a;
opacity: 1;
-webkit-transform: rotate(0deg);
-moz-transform: rotate(0deg);
-o-transform: rotate(0deg);
transform: rotate(0deg);
-webkit-transition: 0.25s ease-in-out;
-moz-transition: 0.25s ease-in-out;
-o-transition: 0.25s ease-in-out;
transition: 0.25s ease-in-out;
}
#nav-icon2 span:nth-child(even) {
left: 50%;
border-radius: 0 9px 9px 0;
}
#nav-icon2 span:nth-child(odd) {
left: 0px;
border-radius: 9px 0 0 9px;
}
#nav-icon2 span:nth-child(1),
#nav-icon2 span:nth-child(2) {
top: 0px;
}
#nav-icon2 span:nth-child(3),
#nav-icon2 span:nth-child(4) {
top: 18px;
}
#nav-icon2 span:nth-child(5),
#nav-icon2 span:nth-child(6) {
top: 36px;
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(1),
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(6) {
-webkit-transform: rotate(45deg);
-moz-transform: rotate(45deg);
-o-transform: rotate(45deg);
transform: rotate(45deg);
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(2),
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(5) {
-webkit-transform: rotate(-45deg);
-moz-transform: rotate(-45deg);
-o-transform: rotate(-45deg);
transform: rotate(-45deg);
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(1) {
left: 5px;
top: 7px;
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(2) {
left: calc(50% - 5px);
top: 7px;
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(3) {
left: -50%;
opacity: 0;
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(4) {
left: 100%;
opacity: 0;
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(5) {
left: 5px;
top: 29px;
}
.navbar-toggler:not(.collapsed) #nav-icon2 span:nth-child(6) {
left: calc(50% - 5px);
top: 29px;
}
/* Icon 3 */
#nav-icon3 span:nth-child(1) {
top: 0px;
}
#nav-icon3 span:nth-child(2),
#nav-icon3 span:nth-child(3) {
top: 18px;
}
#nav-icon3 span:nth-child(4) {
top: 36px;
}
.navbar-toggler:not(.collapsed) #nav-icon3 span:nth-child(1) {
top: 18px;
width: 0%;
left: 50%;
}
.navbar-toggler:not(.collapsed) #nav-icon3 span:nth-child(2) {
-webkit-transform: rotate(45deg);
-moz-transform: rotate(45deg);
-o-transform: rotate(45deg);
transform: rotate(45deg);
}
.navbar-toggler:not(.collapsed) #nav-icon3 span:nth-child(3) {
-webkit-transform: rotate(-45deg);
-moz-transform: rotate(-45deg);
-o-transform: rotate(-45deg);
transform: rotate(-45deg);
}
.navbar-toggler:not(.collapsed) #nav-icon3 span:nth-child(4) {
top: 18px;
width: 0%;
left: 50%;
}
/* Icon 4 */
#nav-icon4 span:nth-child(1) {
top: 0px;
-webkit-transform-origin: left center;
-moz-transform-origin: left center;
-o-transform-origin: left center;
transform-origin: left center;
}
#nav-icon4 span:nth-child(2) {
top: 18px;
-webkit-transform-origin: left center;
-moz-transform-origin: left center;
-o-transform-origin: left center;
transform-origin: left center;
}
#nav-icon4 span:nth-child(3) {
top: 36px;
-webkit-transform-origin: left center;
-moz-transform-origin: left center;
-o-transform-origin: left center;
transform-origin: left center;
}
.navbar-toggler:not(.collapsed) #nav-icon4 span:nth-child(1) {
-webkit-transform: rotate(45deg);
-moz-transform: rotate(45deg);
-o-transform: rotate(45deg);
transform: rotate(45deg);
top: -3px;
left: 8px;
}
.navbar-toggler:not(.collapsed) #nav-icon4 span:nth-child(2) {
width: 0%;
opacity: 0;
}
.navbar-toggler:not(.collapsed) #nav-icon4 span:nth-child(3) {
-webkit-transform: rotate(-45deg);
-moz-transform: rotate(-45deg);
-o-transform: rotate(-45deg);
transform: rotate(-45deg);
top: 39px;
left: 8px;
}
.navbar-toggler {
padding: 0;
&:focus {
box-shadow: none;
}
}
`
const LinkUnderlineStyles = css`
content: ' ';
position: absolute;
width: calc(100% - 1.6rem);
height: 2px;
bottom: 0;
background: white;
opacity: 1;
transition: opacity 0.3s ease;
`
// styled components
export const Li = styled.li`
position: relative;
@media (min-width: 992px) {
&:after {
content: ' ';
position: absolute;
width: 1px;
height: 50%;
top: 25%;
right: 0;
background: white;
}
&:nth-last-child(1) {
&:after {
display: none;
}
}
}
`
export const StyledLink = styled((props) => <Link {...props} />)`
padding-right: 0.8rem !important;
padding-left: 0.8rem !important;
color: white;
&:before {
${LinkUnderlineStyles}
opacity: ${(props) => (props.active === 'yes' ? '1' : 0)};
}
&:hover {
color: white;
&:before {
${LinkUnderlineStyles}
}
}
`

80
src/components/seo.tsx Normal file

@@ -0,0 +1,80 @@
import * as React from 'react'
import PropTypes from 'prop-types'
import { Helmet } from 'react-helmet'
import { useStaticQuery, graphql } from 'gatsby'
const Seo = ({ description, lang, meta, title, previewImg = undefined }) => {
const { site } = useStaticQuery(graphql`
query {
site {
siteMetadata {
title
description
siteUrl
author {
name
}
social {
linkedin
}
}
}
}
`)
const author = site.siteMetadata?.author?.name
const metaDescription = description || site.siteMetadata.description
const defaultTitle = site.siteMetadata?.title
const pageTitle = title ? `${title} | ${defaultTitle}` : defaultTitle
const siteUrl = site.siteMetadata?.siteUrl
const image = previewImg
? `${siteUrl}${previewImg}`
: `${siteUrl}/img/data-controller.svg`
return (
<Helmet
htmlAttributes={{
lang
}}
title={pageTitle}
meta={[
{ name: 'author', property: 'author', content: author },
{
name: 'description',
property: 'og:description',
content: metaDescription
},
// { name: 'facebook:site', content: '', },
{ name: 'image', property: 'og:image', content: image },
{
name: `linkedin:site`,
content: site.siteMetadata?.social?.linkedin || ``
},
{ name: `twitter:card`, content: `summary` },
// { name: `twitter:creator`, content: site.siteMetadata?.social?.twitter || `` },
{ name: `twitter:description`, content: metaDescription },
// { name: 'twitter:site', content: `${site?.twitter}`, },
{ name: `twitter:title`, content: pageTitle },
// { name: 'youtube:site', content: `${site?.youtube}`, },
{ property: `og:title`, content: pageTitle },
{ property: `og:type`, content: `website` }
].concat(meta)}
/>
)
}
Seo.defaultProps = {
lang: `en`,
meta: [],
description: ``,
title: ``
}
Seo.propTypes = {
description: PropTypes.string,
lang: PropTypes.string,
meta: PropTypes.arrayOf(PropTypes.object),
title: PropTypes.string
}
export default Seo
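The component above builds `pageTitle` and `metaDescription` from optional page-level values with site-wide defaults. The same fallback behaviour as pure functions (a sketch assuming an empty string means "no page title"; the sample strings are illustrative):

```javascript
// Mirrors the Seo component's fallbacks: "Page | Site" when a page title is
// given, otherwise the site default; likewise for the meta description.
function buildPageTitle(title, defaultTitle) {
  return title ? `${title} | ${defaultTitle}` : defaultTitle
}

function buildMetaDescription(description, siteDescription) {
  return description || siteDescription
}

console.log(buildPageTitle('Pricing', 'Data Controller')) // → Pricing | Data Controller
console.log(buildPageTitle('', 'Data Controller')) // → Data Controller
console.log(buildMetaDescription('', 'DC site description')) // → DC site description
```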


@@ -0,0 +1,18 @@
import React from 'react'
import styled from 'styled-components'
type DataProps = {
children?: React.ReactNode
}
const StyledDiv = styled.div`
@media (min-width: 576px) {
max-width: 1310px;
padding: 0px 50px;
}
`
export const Container: React.FC<DataProps> = ({ children }) => {
return <StyledDiv className="container">{children}</StyledDiv>
}


@@ -0,0 +1,3 @@
export { Container } from './container'
export { Section } from './section'
export { ScheduleDemo } from './scheduleDemo'


@@ -0,0 +1,53 @@
import React from 'react'
import { Link } from 'gatsby'
import styled from 'styled-components'
import { FaEnvelope } from 'react-icons/fa'
export const StyledLink = styled((props) => <Link {...props} />)`
color: rgb(255, 255, 255);
background-color: rgb(144, 196, 69);
border-radius: 0px;
padding: 50px 10px;
width: 100%;
margin: 0px;
border: none;
position: relative;
display: block;
text-decoration: none;
font-size: 1.3rem;
line-height: 1.2em;
text-align: center;
max-width: 100%;
font-family: Montserrat, HelveticaNeue, 'Helvetica Neue', Helvetica, Arial;
svg {
opacity: 0;
transition: all 0.5s ease;
}
&:hover {
color: white;
background-color: #314351;
svg {
opacity: 1;
}
}
transition: all 0.3s ease;
`
const iconStyles = { marginTop: '-2px', marginLeft: '5px' }
const textStyles = { opacity: '0.7', fontSize: '1rem', margin: '12px auto 0' }
export const ScheduleDemo = () => {
return (
<StyledLink to="/contact">
<span>
Schedule a Free Demo <FaEnvelope size={18} style={iconStyles} />
</span>
<p style={textStyles}>
Contact us for a free demonstration of Data Controller.
</p>
</StyledLink>
)
}


@@ -0,0 +1,34 @@
import React from 'react'
import styled from 'styled-components'
import { BottomSectionArrow } from './styledComponents'
import { Container } from './'
type DataProps = {
children?: React.ReactNode
color?: string
bgColor?: string
bottomArrow?: boolean
}
const StyledSection = styled.div`
position: relative;
padding: 50px 0;
color: ${(props) => props.color || 'white'};
background-color: ${(props) => props.bgColor || '#314351'};
`
export const Section: React.FC<DataProps> = ({
children,
bgColor,
color,
bottomArrow = true
}) => {
return (
<StyledSection bgColor={bgColor} color={color}>
<Container>{children}</Container>
{bottomArrow && <BottomSectionArrow />}
</StyledSection>
)
}


@@ -0,0 +1,88 @@
import React from 'react'
import styled from 'styled-components'
const BottomArrow = styled.div`
width: 50px;
height: 50px;
position: absolute;
bottom: 10px;
background: inherit;
transform: translateX(-50%) rotate(45deg);
left: 50%;
// right: 0;
// margin-left: auto;
// margin-right: auto;
z-index: 10;
`
const BottomArrowWrapper = styled.div`
width: 100%;
height: 20px;
position: absolute;
overflow: hidden;
margin-top: 50px;
background-color: inherit;
`
export const BottomSectionArrow = () => (
<BottomArrowWrapper>
<BottomArrow />
</BottomArrowWrapper>
)
export const SectionHeading = styled.h2`
text-align: ${(props) => (props.center === 'no' ? 'left' : 'center')};
letter-spacing: 1px;
font-weight: 400;
font-family: 'Montserrat', 'HelveticaNeue', 'Helvetica Neue', Helvetica, Arial,
sans-serif;
text-transform: uppercase;
`
export const SectionDesc = styled.p`
text-align: ${(props) => (props.center === 'no' ? 'left' : 'center')};
opacity: ${(props) => props.opacity ?? 0.6};
a {
color: inherit;
}
`
const StyledSolidButton = styled.button`
padding: 0.75rem 1.5rem;
font-size: 0.75rem;
border-width: 2px;
width: 100%;
max-width: 250px;
&:hover {
opacity: 0.9;
}
&.btn-dark {
background-color: #2e4252;
}
`
export const SolidButton = ({
children,
theme = 'light',
type = 'submit',
onClick = undefined
}) => (
<StyledSolidButton
type={type}
className={`btn btn-${theme}`}
onClick={onClick}
>
{children}
</StyledSolidButton>
)
const StyledOutlineButton = styled.button`
margin: 50px 0;
padding: 0.75rem 1.5rem;
font-size: 0.75rem;
border-width: 2px;
`
export const OutlineButton = ({ children }) => (
<StyledOutlineButton type="button" className="btn btn-outline-light">
{children}
</StyledOutlineButton>
)

BIN
src/images/contact_bg.jpg Normal file

Binary file not shown. Size: 101 KiB

Binary file not shown. Size: 113 KiB

BIN
src/images/dc-software.png Normal file

Binary file not shown. Size: 8.4 KiB

BIN
src/images/dclogo.png Normal file

Binary file not shown. Size: 7.8 KiB

BIN
src/images/favicon.png Normal file

Binary file not shown. Size: 34 KiB

Binary file not shown. Size: 134 KiB

Binary file not shown. Size: 212 KiB

Binary file not shown. Size: 222 KiB

Binary file not shown. Size: 218 KiB

Some files were not shown because too many files have changed in this diff.