Ascend Glossary

This section provides a list of commonly used terms and concepts within the Ascend Platform.


A
Analyzing: A component state indicating that the Ascend Platform is determining the optimal data processing strategy.
Ascend Dataflow: A single data workflow (a DAG, or Directed Acyclic Graph) defined on the Ascend Platform, consisting of Connectors, Transforms, and Data Feeds.
Ascend Platform: An Autonomous Dataflow Platform that enables users to self-serve and iterate on data in a matter of minutes, not weeks. With the platform, users can discover and trace existing Dataflows, build, iterate on, and enrich Dataflows in a self-serve fashion, and collaborate with their team and other organizations to share and reuse Dataflows.
Ascend Documentation: Online documentation covering the query expressions supported by the Ascend Platform, along with resources such as tutorials.
Ascend System Dashboard: An overview of the types and statuses of all components across the system that the User has access to.
B
Business Logic: Logic that defines data transformations in an Ascend Dataflow, usually as SQL or PySpark within Ascend Transforms, code in Parser Functions, and Custom Source Functions.
Business Requirements Document (BRD): A formal document for capturing the business requirements for an Ascend Data Service.
C
Clean Transform: A Transform on the Ascend Platform that produces clean, canonical data by performing data cleansing, deduplication, and other normalization; usually the first Transform created for each Read Connector.
Component: Any object on the Ascend Platform, such as a Read or Write Connector, a Transform, or a Data Feed.
Component Grouping: Multiple components on the Ascend Platform that are grouped together.
Context Menu: A drop-down menu accessed by right-clicking a component on an Ascend Dataflow.
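The kind of cleansing and deduplication a Clean Transform performs can be sketched in plain Python. A real Clean Transform would express the same logic in SQL or PySpark; the record fields and normalization rules below are hypothetical examples, not part of the Ascend API.

```python
# Sketch of the cleansing a Clean Transform might perform; field names
# and rules are hypothetical, not part of the Ascend API.
raw_rows = [
    {"id": "1", "email": " ALICE@EXAMPLE.COM "},
    {"id": "1", "email": "alice@example.com"},   # duplicate of id 1
    {"id": "2", "email": "Bob@Example.com"},
]

def clean(rows):
    seen, out = set(), []
    for row in rows:
        normalized = {
            "id": row["id"].strip(),
            "email": row["email"].strip().lower(),  # canonical form
        }
        if normalized["id"] not in seen:            # deduplicate on id
            seen.add(normalized["id"])
            out.append(normalized)
    return out

print(clean(raw_rows))
# → [{'id': '1', 'email': 'alice@example.com'}, {'id': '2', 'email': 'bob@example.com'}]
```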
D
Data Admin: An Ascend user who has full access to all Dataflows and their data within a Data Service.
Data Feed: A mechanism in Ascend by which Dataflows and Data Services can communicate live data with each other.
Data Service: The highest-level object in the Ascend hierarchy; a Data Service contains Teams, Users, and Dataflows.
Dataflow: Unlike point-to-point pipelines, which can only be queried at their static endpoints, Dataflows understand that data is inherently connected and dynamically changing. With Dataflows, you can analyze, iterate on, and reuse data and logic at any stage, and always trust that the data remains live and up to date.
Development Data: A subset of Production Data suitable for testing and developing a Dataflow.
E
Environment: A unique deployment of the Ascend system.
Everyone: The default Team, which has no active permissions beyond access to the Data Service.
Export: The process of retrieving the JSON representation of a collection of components from a Dataflow.
F
Full reduction: A scenario in which a single output partition is produced from all input partitions.
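The partition relationships this glossary distinguishes (Mapping, Partitioning, Reduction, and Full reduction) can be illustrated with a toy sketch in Python. The input partitions here are just lists of numbers, invented for illustration, not real Ascend partitions.

```python
# Toy illustration of partition relationships; inputs are made up.
inputs = [[1, 2], [3, 4], [5, 6]]  # three input partitions

# Mapping: each output partition is produced by exactly one input partition.
mapped = [[x * 10 for x in part] for part in inputs]

# Reduction: fewer output partitions than input partitions (pairs merged).
reduced = [inputs[0] + inputs[1], inputs[2]]

# Full reduction: a single output partition produced from all inputs.
total = [sum(sum(part) for part in inputs)]

print(mapped)   # [[10, 20], [30, 40], [50, 60]]
print(reduced)  # [[1, 2, 3, 4], [5, 6]]
print(total)    # [21]
```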
G
H
I
Import: The process of adding a collection of components, represented in JSON, to a Dataflow.
Ingestion: The process of importing data into the Ascend Platform for processing and storage.
Intercom: The in-application messaging interface for Users to chat directly with Ascend customer support.
J
K
L
Listing: A component state indicating that the Ascend Platform is checking for unprocessed files in the designated data location.
M
Maintenance: The process of updating a Dataflow or fixing bugs in it.
Mapping: A scenario in which each output partition is produced by exactly one input partition.
Materialized View: The result of a query that is stored in the Ascend Platform; all Transforms in Ascend are materialized.
Migration: The process of promoting a Development Dataflow to a Production Dataflow.
N
O
Out-Of-Date: A component state indicating that the Ascend Platform has discovered new work that needs to be completed.
P
Parser Function: A function that allows users to embed custom code to expand support for custom data formats.
Parsing: A component state in which the Ascend Platform is transforming the data files into rows of records.
Partition: A single logical chunk of data processed by Ascend; it can be materialized as a single data fragment, a table or file in the Ascend backend, a file in a Read Connector, or a file in a Write Connector.
Partitioning: A data partitioning scenario in which each output partition is produced with data from one or more input partitions.
Permissions: Rules for allowing or denying access to a Data Service or Dataflow.
Production Data: The entire dataset currently available for ingestion that the production Dataflow requires.
Production Dataflow: A stable, fully developed Ascend Dataflow running in production mode and integrated with upstream and downstream production systems.
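In spirit, a Parser Function takes raw bytes in a custom format and turns them into rows of records. The function shape below is a hypothetical illustration of that idea, not the actual Ascend Parser Function interface.

```python
# Hypothetical parser for a custom "key=value;key=value" line format.
# The signature is illustrative only; it is not the Ascend interface.
def parse(raw: bytes):
    for line in raw.decode("utf-8").splitlines():
        if not line.strip():
            continue  # skip blank lines
        # Split each line into key=value pairs and emit one record per line.
        yield {k: v for k, v in (pair.split("=", 1) for pair in line.split(";"))}

payload = b"name=widget;qty=4\nname=gadget;qty=7\n"
print(list(parse(payload)))
# → [{'name': 'widget', 'qty': '4'}, {'name': 'gadget', 'qty': '7'}]
```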
Q
R
Raw Builder: An Ascend query builder that supports SQL syntax highlighting, comprehensive auto-completion, and code formatting.
Read Connector: An Ascend component that pulls data from an upstream storage location into the Ascend Platform.
Read Connector Update Interval: An Ascend parameter that controls how frequently the system checks Read Connectors for updates; Users can set this parameter in the UI.

Reading: A component state in which the Ascend Platform is reading in the source data.
Reduction: A data partitioning scenario that reduces the number of partitions associated with a Transform.
Reshaping: A component state indicating that the Ascend Platform is modifying the internal data storage format for optimal processing.
Running: A component state in which the Ascend Platform is processing work.
S
Smart Partitioning: A feature in Ascend that automatically places data into different buckets based on a data field of Timestamp type in the Group By clause.
SQL Builder: A query building tool in Ascend with auto-completion of built-in clauses and keywords to assist users in query building.
Staging: A component state indicating that the Ascend Platform is persisting data for optimal loading performance.
Super Admin: An Ascend user who is both a Data Admin and a User Admin.
Sweep Task: An internal task that automatically deletes all outdated files in file Write Connectors.
Sweeping: A component state indicating that the Ascend Platform is removing outdated data.
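The timestamp-based bucketing that Smart Partitioning performs can be imitated in plain Python. The daily granularity, record shape, and field names below are assumptions for illustration only.

```python
from collections import defaultdict
from datetime import datetime

# Toy imitation of timestamp-based bucketing: group records into daily
# buckets by a Timestamp-typed field, as Smart Partitioning does when
# such a field appears in the Group By clause. Records are made up.
records = [
    {"ts": datetime(2023, 5, 1, 9, 30), "value": 1},
    {"ts": datetime(2023, 5, 1, 17, 0), "value": 2},
    {"ts": datetime(2023, 5, 2, 8, 15), "value": 3},
]

buckets = defaultdict(list)
for rec in records:
    buckets[rec["ts"].date().isoformat()].append(rec["value"])

print(dict(buckets))
# → {'2023-05-01': [1, 2], '2023-05-02': [3]}
```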
T
Team: A group of Users in Ascend that share the same set of permissions and access.
Transform: A data transformation specified in SQL or PySpark in a Dataflow.
U
Up-To-Date: A component state indicating that the data in the component is fully processed and internally consistent with other data in the Dataflow.
User: An individual utilizing the Ascend Platform.
User Admin: An Ascend User responsible for user management within a Data Service.
V
Validation: The process of confirming that the results of a Dataflow conform to the documented business requirements.
Version Control: The system for tracking changes to a file, typically applicable to Ascend JSON exports.
W
Write Connector: A data integration point that pushes data from the Ascend Platform to a downstream storage location.
Workspace: A working area where Users can pin reference components to view side by side while working on another component.
X
Y
Z