Data

In “Data” you can manage Data Engines, add external storages, and configure jobs.

If you don't have a Data Engine, you can request one via the "Request Data Engine" button.

How to Access

Go to "Data" from the left sidebar. Once you open the menu, you will see the Data Engines available for the space, each displaying the following information:

  • Data Engine Name

  • Total Disk: available storage space

  • Total Memory: available memory capacity

  • CPU Cores: data processing cores

When you click a Data Engine, a screen opens showing the following sections:

Basic Information

In the first section, you can view the Data Engine name and description (if available) and use the shortcut to Data Studio.

Metrics

Monitor the Data Engine metrics:

  • Memory: Percentage of machine memory usage.

  • Disk: Percentage of disk used.

  • CPU: Percentage of CPU usage. This value can vary significantly.

  • Total memory: Amount of contracted memory.

  • Total disk: Amount of contracted disk.

  • CPU cores: Number of contracted cores.

  • View processes: Any running processes (e.g., SQL) are listed here, and you can also cancel any of them.

External storages

You can view all the external storages registered in the Data Engine. Additionally, you can create, edit, and delete a storage.

Create external storage

To create an external storage, follow these steps:

  1. Click "Create new external storage".

  2. Choose the type of storage and fill in the fields according to the selected type. The types of storage are:

AWS S3

To create a new external storage of type AWS S3, fill in the following fields:

  • Storage name: Enter a name to identify the storage.

  • Access Key ID: Insert your AWS Access Key ID.

  • Secret access key: Insert your AWS Secret Access Key.

  • Storage region: Select the region where your S3 bucket is located.

  • Endpoint (optional): Custom endpoint for reading files within Data Studio.

  • Storage URL (optional): Custom URL for connecting to S3.
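
If you want to confirm the credentials before registering the storage, here is a minimal sketch using boto3 (the bucket name and key values are placeholders, not values provided by the platform):

```python
import boto3

# Placeholders: use the same values you enter in the form above.
s3 = boto3.client(
    "s3",
    aws_access_key_id="AKIA...",   # Access Key ID
    aws_secret_access_key="...",   # Secret access key
    region_name="us-east-1",       # Storage region
)

# Listing a few objects confirms the credentials can read the bucket.
response = s3.list_objects_v2(Bucket="my-bucket", MaxKeys=5)
for obj in response.get("Contents", []):
    print(obj["Key"])
```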

Google Cloud Storage

To create a new external storage of type Google Cloud Storage, fill in the following fields:

  • Storage name: Enter a name to identify the storage.

  • Service Account: Enter the associated service account.
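
As a quick sanity check, the same service account can be tested with the google-cloud-storage client (the key file path and bucket name are placeholders):

```python
from google.cloud import storage

# Placeholder path: the JSON key of the service account entered above.
client = storage.Client.from_service_account_json("service-account.json")

# Listing a few blobs confirms the account can read the bucket.
for blob in client.list_blobs("my-bucket", max_results=5):
    print(blob.name)
```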

Azure

To create a new external storage of type Azure, fill in the following fields:

  • Storage name: Enter a name to identify the storage.

  • Service type: Choose the type of service—Blob Storage or Data Lake Storage.

  • Connection string: Enter the Azure connection string.
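
To verify a connection string before registering it, here is a minimal sketch with azure-storage-blob (the string and container name are placeholders):

```python
from azure.storage.blob import BlobServiceClient

# Placeholder: copy the real connection string from the Azure portal.
conn_str = (
    "DefaultEndpointsProtocol=https;AccountName=myaccount;"
    "AccountKey=...;EndpointSuffix=core.windows.net"
)
service = BlobServiceClient.from_connection_string(conn_str)

# Listing a few blobs confirms the connection string works.
container = service.get_container_client("my-container")
for blob in container.list_blobs():
    print(blob.name)
```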

Custom Storage

To create a new external storage of type Custom Storage, fill in the following fields:

  • Storage name: Enter a name to identify the storage.

  • Access Key ID: Enter the Access Key ID (the access ID generated by the HMAC key, used for reading files).

  • Secret access key: Enter the Secret Access Key (the access key generated by the HMAC key, used for reading files).

  • Storage region: Enter the region where the storage is located.

  • Endpoint: Endpoint used for reading files within Data Studio.

  • Storage URL: URL of the storage used for authentication.

ℹ️ Tested external storages: localstack and clouds2africa.
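
The Custom Storage fields mirror S3-style (HMAC) credentials, so an S3-compatible client can be used to test the endpoint. A minimal sketch against a local LocalStack instance (all values are placeholders):

```python
import boto3

# Placeholders matching the form fields; LocalStack's default endpoint shown.
s3 = boto3.client(
    "s3",
    aws_access_key_id="test",              # Access Key ID (HMAC key)
    aws_secret_access_key="test",          # Secret access key (HMAC key)
    region_name="us-east-1",               # Storage region
    endpoint_url="http://localhost:4566",  # Endpoint / Storage URL
)

# Listing buckets confirms the endpoint and credentials are reachable.
print([bucket["Name"] for bucket in s3.list_buckets()["Buckets"]])
```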

  3. To finalize, click "Create".

Edit external storage

To edit an external storage, follow these steps:

  1. In the storage list, click "Edit" (pencil icon) on the item you want to modify.

  2. Make the desired changes.

You cannot change the storage type or name.

  3. To finish, click "Save changes".

Delete external storage

To delete an external storage, follow these steps:

  1. In the storage list, click "Delete" (trash can icon) on the item you want to remove.

  2. In the modal window that appears, confirm the deletion of the external storage by typing the storage name and then clicking "Delete".

The action cannot be undone.

You can also delete an external storage via the "Edit" option.

Data Consumers

When creating a “file template” (read more in the Data Share Features article), you need to configure which users will have access to the endpoint. These users are called “Data Consumers”.

Follow the steps below to create, edit, and delete a data consumer:

Create Consumer

To create a data consumer, follow these steps:

  1. Click "Create consumer".

  2. Enter the user’s name. Note that the username will take the form user@engine_name.

  3. By default, the authentication type is "Basic".

  4. Add a password.

  5. To finish, click "Save user".

Done! The data consumer has been created!
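
With Basic authentication, the consumer's credentials can then be used against the endpoints it was granted access to. A minimal sketch (the URL is hypothetical; use the endpoint generated for your file template):

```python
import requests
from requests.auth import HTTPBasicAuth

# Hypothetical endpoint URL; the username follows the user@engine_name pattern.
url = "https://example.com/data/endpoint"
response = requests.get(url, auth=HTTPBasicAuth("reader@my_engine", "my-password"))
response.raise_for_status()
print(response.status_code)
```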

Edit Consumer

To edit a data consumer, follow these steps:

  1. In "Data Consumers", click "Edit" (pencil icon) on the name you want to modify.

  2. Make the desired changes.

You cannot change the user’s name.

  3. To finish, click "Save changes".

Delete Consumer

To delete a data consumer, follow these steps:

  1. In "Data Consumers", click "Delete" (trash can icon) on the name you want to remove.

  2. In the modal window that appears, confirm the deletion of the user by typing the user’s name and then clicking "Delete".

The action cannot be undone.

You can also delete a user via the "Edit" option.
