Analysis Philosophy: TLOGic provides multiple analysis perspectives (departments, operators, customers, items, time-based) to help you understand your business from different angles. Use these views together for comprehensive insights.
Department Analysis
Understanding Department Performance
The Departments tab provides visual and tabular analysis of sales performance across all product categories.
Key Metrics Available:
- Total Sales: Gross sales amount by department
- Returns: Return amounts and rates
- Voids: Voided transaction amounts
- Item Count: Number of distinct items sold
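If you export the underlying transaction lines, these department metrics can be spot-checked outside the app. A minimal sketch in Python/pandas, assuming hypothetical column names (department, item_id, amount, line_type) that will likely differ from your actual export schema:

```python
import pandas as pd

# Hypothetical export schema -- adjust column names to match your actual TLOGic CSV.
lines = pd.DataFrame({
    "department": ["Grocery", "Grocery", "Produce", "Produce", "Grocery"],
    "item_id":    ["1001", "1002", "2001", "2001", "1001"],
    "amount":     [4.99, 12.50, 3.25, 3.25, 4.99],
    "line_type":  ["sale", "sale", "sale", "return", "void"],
})

# Sum amounts per department and line type (sales, returns, voids).
by_dept = lines.groupby(["department", "line_type"])["amount"].sum().unstack(fill_value=0)

# Distinct items sold per department.
by_dept["item_count"] = (
    lines[lines.line_type == "sale"].groupby("department")["item_id"].nunique()
)

# Return rate as a share of gross sales dollars.
by_dept["return_rate"] = by_dept.get("return", 0) / by_dept["sale"]
print(by_dept)
```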
Interactive Treemap
The department treemap visualization uses:
- Rectangle Size: Proportional to sales amount (larger = more sales)
- Color Intensity: Indicates return rate (darker = higher returns)
- Hover Details: Shows exact values for the department under the cursor
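For reports built outside the app, a treemap with the same size and color encoding can be approximated with plotly. This is only a sketch of the encoding described above, not TLOGic's own rendering, and the department figures are made up:

```python
import pandas as pd
import plotly.express as px

# Hypothetical department summary; in practice, export it from the Departments tab.
depts = pd.DataFrame({
    "department":  ["Grocery", "Produce", "Bakery", "Deli"],
    "total_sales": [42000.0, 18500.0, 9300.0, 7700.0],
    "return_rate": [0.012, 0.031, 0.006, 0.018],
})

fig = px.treemap(
    depts,
    path=["department"],        # one rectangle per department
    values="total_sales",       # rectangle size proportional to sales amount
    color="return_rate",        # color intensity indicates return rate
    color_continuous_scale="Reds",
)
fig.show()
```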
Business Applications:
- Identify top revenue-generating departments
- Spot departments with concerning return rates
- Allocate floor space based on sales performance
- Set inventory priorities
Operator Performance Analysis
Evaluating Cashier Performance
Operator analytics help identify training needs, recognize top performers, and maintain service standards.
Core Performance Metrics:
- Transaction Count: Volume of transactions processed
- Sales Amount: Total dollar value processed
- Average $/Transaction: Indicates upselling success or transaction complexity
- Void Rate: Frequency of voided transactions (lower is better)
- Refund Rate: Frequency of return processing
- Ring Time: Speed of item scanning/entry
- Tender Time: Speed of payment processing
- Inactive Time %: Percentage of logged-in time spent not processing transactions
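Most of these metrics are straightforward ratios over an operator's transactions, so they can be recomputed from an exported transaction list for verification. A minimal sketch, assuming hypothetical columns (operator, total, voided, refund):

```python
import pandas as pd

# Hypothetical per-transaction export: one row per completed or voided transaction.
txns = pd.DataFrame({
    "operator": ["101", "101", "101", "102", "102"],
    "total":    [23.40, 8.99, 15.25, 42.10, 5.00],
    "voided":   [False, True, False, False, False],
    "refund":   [False, False, False, True, False],
})

ops = txns.groupby("operator").agg(
    txn_count=("total", "size"),
    sales_amount=("total", "sum"),
    void_rate=("voided", "mean"),     # fraction of transactions voided
    refund_rate=("refund", "mean"),   # fraction of transactions refunded
)
ops["avg_per_txn"] = ops["sales_amount"] / ops["txn_count"]
print(ops.round(2))
```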
Using Composite Scores:
The composite score combines multiple metrics with customizable weights. Default weights:
- Sales Amount: +0.35 (highest importance)
- Transaction Count: +0.25
- Avg $/Txn: +0.15
- Void Rate: -0.15 (penalty)
- Void $ Ratio: -0.10
- Refund Rate: -0.05
- Refund $ Ratio: -0.05
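As a rough illustration of how such a weighted composite can be computed, here is a minimal sketch. It assumes each metric is min-max normalized to a 0-1 range before the signed weights are applied; TLOGic's exact normalization may differ, and the per-operator figures are made up.

```python
import pandas as pd

# Default weights from the Operator tab (negative values are penalties).
weights = {
    "sales_amount":  0.35,
    "txn_count":     0.25,
    "avg_per_txn":   0.15,
    "void_rate":    -0.15,
    "void_ratio":   -0.10,
    "refund_rate":  -0.05,
    "refund_ratio": -0.05,
}

# Hypothetical per-operator metrics, one column per weighted metric.
metrics = pd.DataFrame({
    "sales_amount": [5200.0, 3100.0, 6400.0],
    "txn_count":    [180, 120, 210],
    "avg_per_txn":  [28.9, 25.8, 30.5],
    "void_rate":    [0.02, 0.05, 0.01],
    "void_ratio":   [0.015, 0.04, 0.01],
    "refund_rate":  [0.01, 0.03, 0.02],
    "refund_ratio": [0.008, 0.02, 0.015],
}, index=["op_101", "op_102", "op_103"])

# Min-max normalize each metric to 0-1, then apply the signed weights.
normalized = (metrics - metrics.min()) / (metrics.max() - metrics.min())
composite = sum(normalized[col] * w for col, w in weights.items())
print(composite.sort_values(ascending=False))
```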
Tip: Adjust weights in the Operator tab to match your business priorities. For example, increase void rate penalties if accuracy is critical, or emphasize speed metrics for high-volume environments.
Analysis Workflows:
- Performance Reviews: Sort by composite score to identify top/bottom performers
- Training Needs: Filter operators with high void rates or slow timing metrics
- Recognition Programs: Export top performers' data for awards
- Scheduling: Identify high-performers for peak hours
Customer Analysis
Customer Insights
When customer identifiers are present in your TLOG data, you can analyze shopping patterns and loyalty program effectiveness.
- Visit Frequency: Identify repeat customers vs. one-time shoppers
- Basket Size: Average transaction values per customer
- Category Preferences: Which departments customers frequent most
- Loyalty Impact: Compare member vs. non-member behavior
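If your export includes a customer identifier, these measures reduce to simple per-customer aggregations. A minimal sketch, assuming hypothetical columns (customer_id, loyalty, total):

```python
import pandas as pd

# Hypothetical transaction export with a customer identifier column.
txns = pd.DataFrame({
    "customer_id": ["C1", "C1", "C2", "C3", "C3", "C3"],
    "loyalty":     [True, True, False, True, True, True],
    "total":       [34.20, 18.75, 52.10, 12.40, 9.99, 27.30],
})

customers = txns.groupby("customer_id").agg(
    visits=("total", "size"),        # visit frequency
    avg_basket=("total", "mean"),    # basket size
    loyalty=("loyalty", "first"),
)
print(customers)

# Loyalty impact: compare average basket for members vs. non-members.
print(customers.groupby("loyalty")["avg_basket"].mean())
```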
Item-Level Analysis
Product Performance
Drill down to individual SKU performance for merchandising and inventory decisions.
Key Questions to Answer:
- Which products have the highest sales velocity?
- Are there items with unusually high return rates?
- Which price points perform best?
- Are promotional items moving as expected?
Using Item Data:
- Sort by sales quantity to find top movers
- Sort by return rate to identify problem products
- Export data for inventory management systems
- Cross-reference with department data
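The first two sorts are easy to reproduce on an exported item summary. A minimal sketch, assuming hypothetical columns (sku, department, qty_sold, qty_returned):

```python
import pandas as pd

# Hypothetical item-level export; column names may differ in your CSV.
items = pd.DataFrame({
    "sku":          ["1001", "1002", "2001", "3001"],
    "department":   ["Grocery", "Grocery", "Produce", "Bakery"],
    "qty_sold":     [420, 95, 310, 60],
    "qty_returned": [3, 12, 2, 9],
})
items["return_rate"] = items["qty_returned"] / items["qty_sold"]

print(items.sort_values("qty_sold", ascending=False).head(10))      # top movers
print(items.sort_values("return_rate", ascending=False).head(10))   # problem products
```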
Time-Based Analysis (Sales Heatmap)
Sales Patterns Over Time
The Sales/HR tab visualizes transaction patterns by hour and day of week, essential for labor scheduling and operational planning.
Reading the Heatmap:
- Rows: Days of the week (Monday-Sunday)
- Columns: Hours of operation
- Color Intensity: Transaction volume or sales amount
- Hover: View exact transaction counts and amounts
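The same day-by-hour matrix can be rebuilt from exported transaction timestamps, for example to feed a scheduling spreadsheet. A minimal sketch, assuming a hypothetical timestamp column:

```python
import pandas as pd

# Hypothetical export with one row per transaction and a timestamp column.
txns = pd.DataFrame({
    "timestamp": pd.to_datetime([
        "2024-03-04 12:15", "2024-03-04 12:45", "2024-03-04 13:05",
        "2024-03-09 10:30", "2024-03-09 11:10", "2024-03-10 17:20",
    ]),
    "total": [23.40, 8.99, 15.25, 42.10, 5.00, 31.75],
})
txns["day"] = txns["timestamp"].dt.day_name()
txns["hour"] = txns["timestamp"].dt.hour

# Rows = day of week, columns = hour, values = transaction count
# (use aggfunc="sum" for sales amount instead of volume).
heatmap = txns.pivot_table(index="day", columns="hour", values="total",
                           aggfunc="count", fill_value=0)
print(heatmap)
```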
Business Applications:
- Labor Scheduling: Schedule more staff during peak hours (darker colors)
- Break Planning: Schedule breaks during slower periods
- Promotions: Time marketing campaigns for high-traffic periods
- Training: Schedule training during consistently slow hours
- Efficiency: Identify underutilized time periods for operational improvements
Example Insights:
Pattern: High volume Mon-Fri 12-2pm, Sat-Sun 10am-6pm
Action: Increase staffing during these windows, consider extended hours on weekends
Export and Reporting
Getting Data Out of TLOGic
Export data for deeper analysis in Excel, databases, or business intelligence tools.
Export Options:
- CSV Format: Universal format for most software (Excel, databases, BI tools)
- Excel Format: Formatted spreadsheets with proper data types
- Raw Binary: Original TLOG records for archival or specialized analysis
Export Workflows:
- Use search/filters to select specific data subsets
- Select transactions or use "Select All" for bulk export
- Click export button and choose format
- Large exports (10,000+ records) use optimized bulk processing
Tip: For regular reporting, create a consistent workflow: load TLOG → apply standard filters → export → import into your reporting template.
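If you automate the last step of that workflow, a short script can turn each export into a summary sheet your reporting template can import. A minimal sketch, assuming hypothetical file names and columns (timestamp, total); writing Excel output requires openpyxl:

```python
import pandas as pd

# Hypothetical file names -- substitute your actual export and report locations.
txns = pd.read_csv("tlog_export_2024-03-04.csv", parse_dates=["timestamp"])

# Apply the same standard filters you use in the app, e.g. full business days only.
daily = txns.groupby(txns["timestamp"].dt.date)["total"].agg(["count", "sum"])
daily.columns = ["transactions", "sales"]

# Write a summary sheet that your reporting template can link to or import.
daily.to_excel("daily_summary.xlsx", sheet_name="daily_summary")
```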
Combining Analysis Perspectives
Multi-Dimensional Analysis
Get the most insights by combining multiple analysis views:
Example: Investigating High Returns
- Department View: Identify which department has high return rate
- Item View: Find specific products driving returns
- Operator View: Check if returns concentrate around specific operators
- Time View: See if returns happen at specific times
- Raw Data: Examine individual return transactions for patterns
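The first three steps of this walkthrough are the same return-rate grouping applied at different grains. A minimal sketch over an exported raw-data CSV, using hypothetical columns (department, sku, operator, amount, is_return):

```python
import pandas as pd

# Hypothetical raw-line export: one row per line item, flagged as sale or return.
lines = pd.DataFrame({
    "department": ["Produce", "Produce", "Produce", "Grocery", "Grocery"],
    "sku":        ["2001", "2001", "2002", "1001", "1002"],
    "operator":   ["101", "102", "102", "101", "103"],
    "amount":     [3.25, 3.25, 5.50, 4.99, 12.50],
    "is_return":  [False, True, True, False, False],
})

def return_rate(group):
    # Dollar share of lines in this group that are returns.
    return group.loc[group.is_return, "amount"].sum() / group["amount"].sum()

print(lines.groupby("department").apply(return_rate))   # 1. which department?
print(lines.groupby("sku").apply(return_rate))          # 2. which products?
print(lines.groupby("operator").apply(return_rate))     # 3. which operators?
```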
Example: Optimizing Labor Costs
- Heatmap: Identify peak and slow periods
- Operator Data: See transaction counts per operator per shift
- Raw Data Filtering: Analyze specific time windows
- Export: Create labor planning reports
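Step 2 of this example can be approximated by counting transactions per operator per hour in the raw export and comparing the result against the heatmap peaks. A minimal sketch with hypothetical columns (operator, timestamp):

```python
import pandas as pd

# Hypothetical export: one row per transaction with operator and timestamp.
txns = pd.DataFrame({
    "operator": ["101", "101", "102", "102", "103", "103"],
    "timestamp": pd.to_datetime([
        "2024-03-04 09:10", "2024-03-04 12:30", "2024-03-04 12:45",
        "2024-03-04 13:15", "2024-03-04 17:05", "2024-03-04 17:40",
    ]),
})

# Transactions handled per operator per hour -- compare against heatmap peak windows.
per_hour = (txns.groupby(["operator", txns["timestamp"].dt.hour])
                .size()
                .unstack(fill_value=0))
print(per_hour)
```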
Best Practices for Analysis
Do This
- Analyze full business days for consistent metrics
- Compare week-over-week or month-over-month
- Look for patterns, not just individual data points
- Cross-validate findings across multiple views
- Document insights and track improvements
- Share findings with relevant stakeholders
Avoid
- Drawing conclusions from partial day data
- Ignoring context (holidays, promotions, staffing changes)
- Over-reacting to single-day anomalies
- Analyzing corrupt or incomplete files
- Making major decisions without verifying data accuracy
- Forgetting to export important findings