Standard Aggregation

Simplify grouped aggregations and create multi-dimensional views for advanced summaries


Standard Aggregation allows you to summarize your dataset to extract meaningful insights. By grouping data and applying aggregate functions (e.g., sum, average), you can analyze specific groups or variables. It's particularly useful for creating summaries, generating metrics, and preparing data for further analysis.

You can use standard aggregation for scenarios such as the following (illustrated in the sketch after the list):

  • Sales analysis: Group sales data by region and calculate total revenue per region using the Sum function.

  • Customer insights: Group customer data by age groups and calculate the average purchase value per group using the Average function.

  • Performance metrics: Calculate the count of transactions for each product category and display the ratio of each category to the total transactions.
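To make these patterns concrete, here is a minimal pandas sketch of the same three aggregations. It is illustrative only; the column names (region, revenue, age_group, purchase_value, product_category) are assumptions rather than Tellius field names, and Tellius performs the equivalent work for you through the Aggregate window described below.

```python
# Illustrative only: group-by aggregations equivalent to the use cases above.
import pandas as pd

sales = pd.DataFrame({
    "region": ["East", "East", "West", "West"],
    "revenue": [120.0, 80.0, 200.0, 50.0],
    "age_group": ["18-25", "26-35", "18-25", "26-35"],
    "purchase_value": [30.0, 45.0, 25.0, 60.0],
    "product_category": ["A", "B", "A", "A"],
})

# Sales analysis: total revenue per region (Sum)
revenue_by_region = sales.groupby("region")["revenue"].sum()

# Customer insights: average purchase value per age group (Average)
avg_purchase_by_age = sales.groupby("age_group")["purchase_value"].mean()

# Performance metrics: transaction count per category and its share of the total
counts = sales.groupby("product_category")["revenue"].count()
category_share = counts / counts.sum()

print(revenue_by_region, avg_purchase_by_age, category_share, sep="\n\n")
```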

Steps to perform standard aggregation

  1. Under Data → Prepare → Data, select the required dataset and click on Edit.

  2. Above the Data Pipeline, click on the Aggregate option.

  3. The following window will be displayed.

  4. You can find three tabs: Standard, Time Series, and Pivot.

Standard tab

The Standard tab is used for basic aggregation of data based on specific fields or time ranges. It is ideal when you want to calculate standard metrics (such as sum, average, or count) for grouped data and get quick summary-level aggregations, like total sales per category.

Enter the appropriate values in the following fields (a rough code equivalent follows the list):

  • TimeRange: Select the time range of data to include in the aggregation.

  • Date/Time Column: Enter the date/time column that you want to use for the aggregation.

  • Group by Field: Enter the field name by which you want to group the data in the dataset (e.g., categories, regions).

  • Select aggregation column and type: Select the column to be aggregated for each group and the type of aggregation (e.g., sum, average).

  • (Optional) Add an aggregation column to include additional aggregation columns and types.

  • (Optional) Advanced Filters to add filter conditions to your aggregation.
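As a rough mental model, the Standard tab settings above correspond to filtering by a date range, grouping by a field, and applying an aggregation. The pandas sketch below is an assumption-based illustration, not how Tellius runs the aggregation internally; order_date, category, and sales are made-up column names.

```python
# Illustrative mapping of the Standard tab fields to a simple aggregation.
import pandas as pd

df = pd.DataFrame({
    "order_date": pd.to_datetime(["2024-01-05", "2024-02-10", "2024-02-20", "2024-03-01"]),
    "category": ["Toys", "Toys", "Books", "Books"],
    "sales": [100.0, 150.0, 80.0, 120.0],
})

# TimeRange + Date/Time Column: keep only rows inside the chosen range
in_range = df[df["order_date"].between(pd.Timestamp("2024-01-01"), pd.Timestamp("2024-02-28"))]

# Advanced Filters (optional): e.g., keep only rows with sales above 90
filtered = in_range[in_range["sales"] > 90]

# Group by Field + aggregation column and type: total sales per category
total_sales_per_category = filtered.groupby("category")["sales"].sum().reset_index()
print(total_sales_per_category)
```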

Time Series tab

The Time Series tab helps in aggregating data over time intervals to analyze trends. It is useful for time-series analysis, trend forecasting, and monitoring performance over time, and is ideal when you want to calculate metrics over time, such as daily, weekly, or monthly sales.

Enter the appropriate values in the following fields (see the sketch after the list):

  • TimeRange: Select the time range of data to include in the aggregation.

  • Date/Time Column: Enter the date/time column that you want to use for the aggregation.

  • Resolution: Define the time granularity (e.g., daily, weekly, monthly).

  • Select aggregation column and type: Select the column to be aggregated for each group and the type of aggregation (e.g., sum, average).

  • (Optional) Add an aggregation column to include additional aggregation columns and types.

  • (Optional) Advanced Filters to add filter conditions to your aggregation.
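For intuition, the Resolution field plays the role of a time bucket. The following pandas sketch is illustrative only; the monthly resolution and the order_date and sales columns are assumptions.

```python
# Illustrative time-series aggregation: monthly totals of a sales column.
import pandas as pd

df = pd.DataFrame({
    "order_date": pd.to_datetime(
        ["2024-01-05", "2024-01-20", "2024-02-03", "2024-02-25", "2024-03-10"]
    ),
    "sales": [100.0, 150.0, 80.0, 120.0, 90.0],
})

# Resolution: "MS" buckets by month start; use "D" for daily or "W" for weekly
monthly_sales = (
    df.groupby(pd.Grouper(key="order_date", freq="MS"))["sales"]
      .sum()
      .reset_index()
)
print(monthly_sales)
```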

Pivot tab

The Pivot tab enables data pivoting for multi-dimensional summaries, for example, when you want to transform raw data into a tabular format with rows, columns, and aggregated values. It is ideal for generating pivot tables for cross-tabulated summaries, such as sales per product per region.

Enter the appropriate values in the following fields (an illustrative sketch follows the list):

  • Select rows and columns: Choose fields to be displayed as rows and columns in the pivot table.

  • Select aggregation column and type: Select the column to be aggregated for each group and the type of aggregation (e.g., sum, average).

  • (Optional) Add an aggregation column to include additional aggregation columns and types.

  • (Optional) Advanced Filters to add filter conditions to your aggregation.
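A pivot with rows, columns, and an aggregated value can be pictured with the pandas sketch below; region, product, and sales are assumed column names used only for illustration.

```python
# Illustrative cross-tabulated summary: sum of sales per region (rows) and product (columns).
import pandas as pd

df = pd.DataFrame({
    "region": ["East", "East", "West", "West", "West"],
    "product": ["A", "B", "A", "A", "B"],
    "sales": [100.0, 150.0, 80.0, 120.0, 90.0],
})

sales_by_region_product = pd.pivot_table(
    df,
    index="region",      # rows
    columns="product",   # columns
    values="sales",      # aggregation column
    aggfunc="sum",       # aggregation type
    fill_value=0,
)
print(sales_by_region_product)
```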

Modify the existing dataset or create a new one

  1. Click on the Next button to proceed, or click on Cancel to discard your changes. The following window will be displayed.

  2. Modify existing dataset: Updates the current dataset directly with the new aggregated columns. Choose this option if you want to overwrite or enhance the existing dataset with the aggregation results.

  3. Create a copy with a new dataset name: Creates a new dataset that includes the aggregation results without altering the original dataset.

    • Dataset name field: Required if you select the "Create a copy with new dataset name" option. This is where you specify a unique name for the new dataset.

  4. Count (No. of records)

    • Allows you to specify the number of records to include in the aggregated dataset.

    • Use this if you only need a subset of records, defined by their count, for analysis or storage.

  5. Percent (0 to 100%)

    • Lets you define what percentage of the total records should be included in the aggregated dataset.

    • Use this to sample a proportion of the data for specific use cases like testing or preliminary analysis (see the sketch after these steps).

  6. Click on Prev to go back to the previous step or click on Submit to finalize your aggregation selections.
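The difference between Count and Percent is simply a fixed number of records versus a proportion of them. The sketch below only illustrates that arithmetic; how Tellius selects the subset internally is not specified here, and the aggregated DataFrame and its columns are made up.

```python
# Illustrative difference between Count (fixed number of rows) and Percent (proportion of rows).
import pandas as pd

aggregated = pd.DataFrame({"category": list("ABCDEFGHIJ"), "total": range(10)})

# Count (No. of records): keep a fixed number of records, e.g. 5
subset_by_count = aggregated.head(5)

# Percent (0 to 100%): keep a proportion of the records, e.g. 30%
subset_by_percent = aggregated.sample(frac=0.30, random_state=42)

print(len(subset_by_count), len(subset_by_percent))  # 5 and 3
```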

๐Ÿ”ข
๐Ÿฃ
Data โ†’ Prepare โ†’ Edit
Aggregate window
Standard tab
Time Series tab
Pivot tab
Modify existing dataset or create a new one