Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Contents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 13
Related Documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
1
Getting Started . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 18
Starting the SdE from the CIW or the Schematic Window . . . . . . . . . . . . . . . . . . . . . . . . 20
Starting the SdE from the System Prompt . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
Solving Color-Flashing Problems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
Defining Colors for Motif Window Manager/OpenWindows . . . . . . . . . . . . . . . . . . . . 22
Disabling the Splash Screen . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
Workspaces and Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
VSdE Main Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
Opening a New Project Item . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Opening an Existing Project Item . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Importing State Information for Test Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Importing Information into an Existing Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Exporting Test Setup Information to a State File . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Renaming a Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Bringing Open Windows into the Foreground . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
Moving Projects and Workspaces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Creating New Workspace Folders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Project Tools . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Documents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
Sweeps/Corners . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Spec Sheets . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Model Calibrations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Characterization and Modeling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
Plans . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2
Creating Workspaces and Projects. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Creating and Opening a Workspace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
Creating a New Workspace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 54
Copying a Workspace from Design Management . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
Opening an Existing Workspace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Opening a Recently-Opened Workspace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
Opening the Environment in Read-Only Mode . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
Adding a Project to a Workspace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
Creating a New Project . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
Creating a New Project Based on a Previous Project . . . . . . . . . . . . . . . . . . . . . . . . 61
Adding Project Items and Design Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
Using the File Menu to Add a Project Item or Design File . . . . . . . . . . . . . . . . . . . . . 63
Using the Add Menu to Add a Project Item or Design File . . . . . . . . . . . . . . . . . . . . . 63
Creating and Managing Workspace Templates . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Creating a New Workspace Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
Importing a Workspace Template . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
3
Managing Workspaces and Projects . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Design Management Operations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
Design Management Check In Form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
4
Understanding Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Project Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
Workspace Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
New Workspace Value Set Form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Workspace Parameters Table Buttons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 98
Workspace Set Right-Click Pop-Up Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 99
Workspace Value Set Operations on Project Menu . . . . . . . . . . . . . . . . . . . . . . . . . 100
Global Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
Local Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 101
5
Test Setup from the Schematic Window . . . . . . . . . . . . . . . . . . . . . . 103
Advanced Analyses and Tasks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
Environment Variables in Test Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Starting the SdE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Starting the SdE from the Schematic Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 105
Starting the SdE from the Command Interpreter Window . . . . . . . . . . . . . . . . . . . . 106
Create Test Form—Composer Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
Selecting a Simulator Integration Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
Copying Test Setup Information from a Template . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Importing Simulation Control Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
Copy Test Form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 110
6
Creating Tests for an ADE Simulator . . . . . . . . . . . . . . . . . . . . . . . . . . 163
Create Test Form—ADE Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
Test Setup Window—ADE Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
Project Parameters in ADE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 168
Test Setup Customization for Other Simulators . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
FILE_VERSION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 171
BUTTON . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
ANALYSIS_TYPE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173
OPPTNV_SETUP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 174
VIEW_NETLIST_RUN . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
VIEW_NETLIST_TEST . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
EXPAND_DOT . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 176
STATUS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
7
Creating Tests for Spectre Simulation . . . . . . . . . . . . . . . . . . . . . . . . . 179
Create Test Form—Spectre Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
Specifying a Netlist Location . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 182
Specifying a Design Location . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 185
Test Setup Window—Spectre Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 186
Netlist Tab—Spectre Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
Includes Tab—Spectre Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 191
Analyses Tab—Spectre Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
Sim Options Tab—Spectre Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 198
8
Creating and Running Advanced Analyses . . . . . . . . . . . . . . . . . . 201
Sweeps . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
Creating a New Sweep . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
Specifying Tests for the Sweep . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Defining Sweep Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
Converting a Sweep to a Corners Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
Corners Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
Specifying Tests for Corners Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
Defining Corners Parameters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
Adding a Sweep of Corners . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
Monte Carlo Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 216
Job Status Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
Results Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
Copying Advanced Analysis Project Items . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
9
Specifying Run Options . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
Overrides Group Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 225
Overriding the DUT Location for a Schematic Design . . . . . . . . . . . . . . . . . . . . . . . 226
Overriding the DUT Location for a Netlist-Only Design . . . . . . . . . . . . . . . . . . . . . . 226
Job Distribution Group Box . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
Launching Remote Jobs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 229
10
Viewing and Analyzing Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 247
Using the Results Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249
Expanding a Folder or Other Result Item . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
Viewing Available Result Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
Opening a Data File for Plotting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
Opening an Association for Plotting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
Flattening and Unflattening Branches . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 252
Displaying Leaf Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Plotting Traces . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
Viewing Tabular Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
Viewing Association Properties . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
Customizing Default Double-Click Action for Leaf Data . . . . . . . . . . . . . . . . . . . . . . 255
Viewing Table Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 255
Formatting Table Cells . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
Analyzing Operating Point Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 263
Filtering Operating Point Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
Specifying a Filter List . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
Back-Annotating Operating Point Information to Schematic Instances . . . . . . . . . . 269
Specifying Instance Annotation Display Options . . . . . . . . . . . . . . . . . . . . . . . . . . . 270
Back-Annotating Node Information to a Schematic . . . . . . . . . . . . . . . . . . . . . . . . . 271
Setting Up Component Parameters and Building Expressions . . . . . . . . . . . . . . . . 272
Using the File Menu . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 274
Viewing Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 276
Viewing Waveforms . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 278
Waveform File Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
Waveform Annotation Check Boxes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 280
11
Verifying and Comparing Designs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 293
Creating and Opening a Spec Sheet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 294
Spec Sheet Buttons . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
Spec Sheet Toolbar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
Spec Sheet Columns . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 296
Adding and Removing Specifications . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 301
Add Spec Row—Results vs. Spec . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 302
Add Spec Row—Comparison of Two Designs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 303
Select Results Directory Form . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
Loading Result Data Automatically . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
Configuring Columns, Rows, and Cells . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 305
Using the Cell Formatting Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
Format Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 308
Alignment Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
Color Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
Border Tab . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
Comparing Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
Changing Results Directories . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
Investigating Pass/Fail Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
Pass/Fail Details for Results vs. Spec . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
Pass/Fail Details for Comparison Spec Sheet . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 316
12
Calibrating Behavioral Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 317
Preparing a Model for Calibration Using a Look-Up Table . . . . . . . . . . . . . . . . . . . . . . . 318
Starting Model Calibration . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 319
13
Developing and Editing a Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 339
Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 339
Creating a Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
Exporting to a Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
Linking to a Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 342
Creating a Hierarchical Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 343
Using the Plan Editor . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 348
Running a Plan from the Environment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 348
Running a Plan from the Command Line . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 349
14
Learning by Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 351
Opening the Design in the Schematic Window . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
Creating a New Workspace . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 352
Creating a Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 353
Specifying Include Information . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 354
Getting Global Design Variable Information from the Schematic . . . . . . . . . . . . . . . 354
Specifying Analyses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 355
Specifying Test Measures . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 355
Running the Test and Viewing Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 357
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 367
Preface
Contents
The Virtuoso Specification-driven Environment User Guide contains the following
chapters:
■ Chapter 1, “Getting Started,” presents a brief introduction to the environment.
■ Chapter 2, “Creating Workspaces and Projects,” introduces workspace and
project creation.
■ Chapter 3, “Managing Workspaces and Projects,” describes the Design Management
feature for workspaces and projects.
■ Chapter 4, “Understanding Parameters,” introduces use of project, global design,
and local design parameters.
■ Chapter 5, “Test Setup from the Schematic Window,” discusses creating tests through
the Virtuoso® Schematic Editor design environment.
■ Chapter 6, “Creating Tests for an ADE Simulator,” describes test creation and setup
targeted for an ADE-available simulator.
■ Chapter 7, “Creating Tests for Spectre Simulation,” describes test creation and setup for
Spectre circuit simulation.
■ Chapter 8, “Creating and Running Advanced Analyses,” demonstrates setup of
advanced analyses including parametric sweeps, corners, and Monte Carlo analyses.
■ Chapter 9, “Specifying Run Options,” describes advanced use functions for sweeps and
corners analyses.
■ Chapter 10, “Viewing and Analyzing Results,” describes use of the Results tab and
results viewing and plot editing packages.
■ Chapter 11, “Verifying and Comparing Designs,” describes how to create a spec sheet
for verification of results or comparison with another design.
■ Chapter 12, “Calibrating Behavioral Models,” describes how to use the environment to
generate silicon-calibrated behavioral models.
■ Chapter 13, “Developing and Editing a Plan,” describes how to create
characterization, model calibration, and other plans.
■ Chapter 14, “Learning by Example,” walks through an example: opening a design,
creating a workspace and a test, and running the test and viewing the results.
■ Appendix G, “Library Mapping File,” specifies syntax for items in the library mapping
file.
■ Appendix H, “VSdE Directories and File Formats,” outlines workspace and project
directory contents and file formats.
■ Appendix I, “Printer Setup,” describes setting up one or more printers to work with
Virtuoso Specification-driven Environment.
Related Documents
The following supplemental reference documents are recommended for use with this product:
■ Virtuoso Spectre Circuit Simulator User Guide and Reference provides information
on syntax and options for Spectre circuit simulation
■ Virtuoso Parasitic Simulation User Guide provides information on how to investigate
and report on the effect of physical parasitics on circuit designs
■ Analog Waveform User Guide provides instruction on how to use the Analog
Waveform viewer
■ MATLAB Measures User Guide provides information about using Cadence MATLAB
measures in VSdE and the MATLAB environment
■ Perl Programming Guides provide information that is useful for creating plans in VSdE
1
Getting Started
Overview
The Virtuoso® Specification-driven Environment (SdE) is an interactive design environment
for the analysis, characterization, and verification of analog, digital, and mixed-signal circuits.
The SdE is also integrated with the Virtuoso Schematic Editor and can be used with multiple
designs simultaneously. For increased performance, you can investigate parasitics in your
design and further refine the extracted view prior to simulating (see the Virtuoso Parasitic
Simulation User Guide for more information).
Note: The Virtuoso Specification-driven Environment is OpenAccess compliant.
[Figure: VSdE flow diagram — tests and build netlists drive analysis and simulation of the
device under test; the simulator results are checked against a Spec Sheet, producing a
pass/fail verification report.]
For ease of reuse, and organizing design data and results, projects are organized into
workspaces using a lib/cell/view hierarchy. One or more related projects (or designs) can be
made available in the environment at the same time. Multiple tests can be used to
characterize a design, and a sweep analysis can vary one or more parameters over multiple
tests in a project. The Spec Sheet tool can be used to verify whether a circuit’s target behavior
and performance goals are met. The environment supports distributed simulation to execute
multiple tests and experiments in parallel. Silicon-calibrated behavioral models help to reduce
simulation times significantly. The environment promotes reuse of design components and
the capture of design intellectual property.
Option                                      Description

-batch                                      Exits the environment after running a plan or Virtuoso®
                                            Characterization and Modeling Environment file (see
                                            -runplan and -rundcm)

-display display_name                       Opens environment windows and forms on the specified X
                                            display (display_name)

-fullhelp                                   Opens the Online Help window

-help                                       Displays vsde command syntax and option usage information

-nosplash                                   Prevents display of the splash screen on startup

-project apf_or_awf [-active project_name]  Provides a mechanism for specifying the path to and name of
                                            an existing project or workspace file; for example:
                                                vsde -project /home/user/projects/MyProj.awf
                                            The -active command-line option can be used to specify the
                                            active project in a workspace as follows:
                                                […] -active twoInputNand

-readonly or -ro                            Opens the environment in read-only mode: no workspace or
                                            project item can be created, modified, or deleted; most Design
                                            Management operations are disabled

-runplan plan_file or -rundcm cme_file      Runs either a named plan file (plan_file), or performs the
                                            Virtuoso Characterization and Modeling Environment
                                            Generate & Run operation from the setup information
                                            contained in the specified .dcmui file (cme_file):
                                                vsde -runplan plans/myPlan.pl
                                                vsde -rundcm modelgen/twoInputNand
If -project is not specified, then the Select Workspace form is opened (see “Creating and
Opening a Workspace” on page 53).
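As an illustrative sketch combining the options above (paths and project names are placeholders taken from the examples, not required values):

```shell
# Open a workspace file with a specific active project, suppressing the splash screen
vsde -nosplash -project /home/user/projects/MyProj.awf -active twoInputNand

# Run a plan in batch mode; the environment exits when the plan completes
vsde -batch -runplan plans/myPlan.pl

# Open a workspace read-only for review
vsde -ro -project /home/user/projects/MyProj.awf
```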
Important
The $ACV_ROOT environment variable must be defined in order to run
mwcolormanager.
Note: The mwcolormanager command is executed only upon the first login to the desktop. The
mwcolormanager program must continue to execute in an infinite loop so that the colormap
entries it allocates will not be marked as free. This loop consumes very little CPU time
and has no interface that generates X traffic.
To use the xrdb method of defining colors, put the following xrdb command in your X server
startup script:
xrdb -merge $ACV_ROOT/admin/Xresources
Workspace parameters belong to a workspace and take precedence over project parameters:
If a parameter of the same name exists in both the Workspace Parameters table and in the
Project Parameters table, the workspace value takes precedence. Using workspace
parameters, multiple projects can share a common set of parameter values (for example,
$temperature, $process, $vdd_value).
Workspace parameters are grouped together in value sets. A value set is a logical grouping
of workspace parameters that are varied together. For example, a set of parameters for
1.8-volt power and a set of parameters for 3.3-volt power.
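The precedence rule can be sketched as follows; resolve_parameter is a hypothetical helper written for illustration only, not part of the VSdE API:

```python
def resolve_parameter(name, workspace_params, project_params):
    """A workspace value shadows a project value of the same name;
    otherwise the lookup falls back to the project parameter."""
    if name in workspace_params:
        return workspace_params[name]
    return project_params.get(name)

# A hypothetical 1.8-volt value set shadowing project defaults
project = {"temperature": 27, "vdd_value": 3.3}
value_set_1v8 = {"vdd_value": 1.8, "process": "tt"}

resolve_parameter("vdd_value", value_set_1v8, project)    # -> 1.8, the workspace value wins
resolve_parameter("temperature", value_set_1v8, project)  # -> 27, falls back to the project
```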
Plans created in one workspace project can run plans in other workspace projects.
Simulation results are grouped by workspace value set name and can be compared using the
Spec Sheet tool.
■ Main menu bar and toolbar (see “Menus” in the Virtuoso Specification-driven
Environment Reference)
■ Files tab, which contains the following items:
❑ Projects in the current workspace
❑ Project tool folders for Documents, Tests, Sweeps/Corners, Spec Sheets,
Model Calibrations, Characterization and Modeling, and Plans for each
project in the workspace
❑ Project files in each tool folder
To open the Create Test form for a test, or a New Project_File form for a sweep, corner,
spec sheet, model calibration, characterization and modeling item, or plan (see the table
below), do the following:
1. In the New File window, select a VSdE Files project item or a Design Files item.
2. Click OK.
Note: For Design Files, an untitled text editor window is opened (see “Text Editor” in the
Virtuoso Specification-driven Environment Reference).
Tip
An item can be renamed by right-clicking its name and selecting Rename from the
pop-up menu. See also “Renaming a Test” on page 34.
2. In the ADE state file directory field, specify the path to a state directory.
Note: The browse button is used to open the Select ADE State File Directory form for
navigating to and selecting the directory containing state files. The default starting point
is ~/.artist_states/libName/cellName, unless a different root directory is specified in
.cdsenv using the asimenv saveDir setting (for example, ./artist_states).
3. In the Test name field, type a name for the test to be created from the information in the
selected state file.
4. In the Test view field, type a view name for the test.
Note: The lib/cell information is obtained from the state file.
5. In the Integration group box, select either VSdE Native or Other (ADE-available)
simulator integration.
6. Select an item from the Simulator drop-down list.
7. Click OK to accept changes and close the form.
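The state-directory default described in the note for step 2 can be sketched as follows; default_state_dir is a hypothetical illustration, not a product API:

```python
import os

def default_state_dir(lib_name, cell_name, cdsenv_save_dir=None):
    """Default ADE state-file location as documented above: if .cdsenv
    specifies an asimenv saveDir, that becomes the root directory;
    otherwise the root is ~/.artist_states."""
    root = cdsenv_save_dir or os.path.join(os.path.expanduser("~"), ".artist_states")
    return os.path.join(root, lib_name, cell_name)

default_state_dir("myLib", "myCell", "./artist_states")  # uses the .cdsenv saveDir root
default_state_dir("myLib", "myCell")                     # falls back under ~/.artist_states
```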
Item                        Description

ADE state file directory    Specify the path to an analog design environment state
                            directory

Test name                   Specify the name of the test to be created from the information
                            in the selected analog design environment state file

Test view                   Specify the view name for the test
                            Note: The lib/cell information is obtained from the state file.

Integration                 Select either VSdE Native (Spectre) or Other (ADE-available)
                            simulator integration (with simulator choices available on a
                            drop-down list)
To import Test Setup information from another test in the project, do the following on the
Import Test form:
1. Select From test in project.
2. Select a test in the current project from the drop-down list.
Importing Test Setup Information from a Test That Is Not in the Current Project
To import Test Setup information from a test that is not in the current project, do the following
on the Import Test form:
1. Select From other test file.
2. Click the browse button to open the Select a Test/Template File form.
3. In the Select a Test/Template File form, navigate to and select a test (.tst) file, and click
Open.
4. Mark/unmark Import customization check boxes (see “Customizing Imported Test
Setup Information” on page 32).
5. Click OK to complete the import operation.
6. In the Test Setup window, click Apply to accept the changes, or Cancel to close the
window without saving the changes.
To import Test Setup information from analog design environment state files, do the following
on the Import Test form:
1. Select From ADE.
2. Click the browse button to open the Select ADE State Directory form.
3. In the Select ADE State Directory form, navigate to and select a directory from which to
import states, and click Open.
4. Mark/unmark Import customization check boxes (see “Customizing Imported Test
Setup Information” on page 32).
5. Click OK to complete the import operation.
6. In the Test Setup window, click Apply to accept the changes, or Cancel to close the
window without saving the changes.
Import customization check boxes are used to enable/disable the import of specific Test
Setup information. When a check box is marked, the imported information is added to any
information already specified on the indicated tab of the Test Setup window. Any imported
information that matches what is already specified on the indicated tab overrides the
existing information on that tab. By default, all check boxes are marked when the form is first
opened. The following Import customization selections are available:
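The add-and-override merge behavior described above can be sketched as follows (a minimal illustration; the field names are hypothetical, not taken from the tool):

```python
def merge_test_setup(existing, imported):
    """Merge imported Test Setup fields into existing ones.

    Imported entries are added to the tab; any entry whose key matches
    one already specified overrides the existing value.
    """
    merged = dict(existing)   # start from what is already on the tab
    merged.update(imported)   # imported values win on matching keys
    return merged

existing = {"temperature": "27", "vdd": "1.8"}
imported = {"vdd": "3.3", "corner": "tt"}
print(merge_test_setup(existing, imported))
# {'temperature': '27', 'vdd': '3.3', 'corner': 'tt'}
```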
Test information is displayed in the Library, Cell name, and Simulator name fields of the
Export Test form. Do the following:
1. Either type a valid State save dir (state directory), or click to open the Select ADE
State Directory form for navigating to and selecting a state directory (click OK when
finished).
Note: The State save dir field is populated with the asimenv saveDir setting, if there is
one, or with ~user/.artist_states otherwise.
2. Type a State name, which becomes a directory in the State save dir in which the state
information is saved.
3. Click OK to export the following state information:
❑ All enabled analyses
❑ Simulation files
❑ Environment options
❑ Simulator options
❑ Convergence settings
❑ Model setup information
❑ Outputs (selected list, plot list, and Calculator measures)
❑ Design variables
Renaming a Test
To rename a test, do the following:
1. In the Tests folder in the VSdE main window, right-click the test you want to rename.
2. Select Rename from the pop-up menu.
The Rename Test form appears.
Tip
Alternatively, right-click a Workspace or workspace folder and select Add
Workspace Folder.
2. Type a name for the new workspace folder.
3. Click OK to add the new folder to the workspace.
Project Tools
For an open project, the following project tool folders are displayed on the Files tab in the
VSdE main window (see “VSdE Main Window” on page 24):
■ Documents on page 36
■ Tests on page 40
■ Sweeps/Corners on page 41
■ Spec Sheets on page 42
■ Model Calibrations on page 42
■ Characterization and Modeling on page 42
■ Plans on page 43
Note: A plus sign on a folder indicates that the folder contains files.
Documents
The Documents folder contains project documentation, which can consist of the following
document types:
■ Text format design files created using the built-in text editor or a user-configured editor
(see “Creating New Project Files or Documents” on page 37 and “Configuring Other
Editors” on page 39)
■ Non-text documentation in any of the following formats: PDF, HTML, GIF (see “Adding
Non-Text Documentation Files to a Project” on page 37)
The built-in text editor can be used to create (text format) design files of the following types
for a project:
■ Verilog-AMS (*.v, *.va, *.ams, *.def)
■ SPICE (*.sp, *.spc, *.ckt, *.inc)
■ Perl (*.pl)
■ MATLAB (*.m)
■ Text (*.txt)
Any of the following methods can be used to display the text editor for creating new project
design files of the above-mentioned types:
■ In the VSdE main window, choose Add – New Document
■ In the VSdE main window, click the new-document button on the toolbar
■ On the Files tab of the VSdE main window, right-click the Documents folder and select
New Document from the pop-up menu
■ Right-click an existing document in the Documents folder and select New from the
pop-up menu
The following non-text file types can be stored in and viewed from the Documents folder of
a project:
■ HTML (*.html)
■ PDF (*.pdf)
■ GIF (*.gif)
The Path field in the Properties window for the Documents folder displays the full path to the
directory location of the project documents. To view the path, do the following:
1. Right-click the Documents folder.
2. Select Properties from the pop-up menu.
Any file of the above-mentioned types can be copied to the docs directory for a project so
that it is displayed in the Documents folder.
Note: For spec sheets exported to HTML, the resulting file is copied to the docs directory
automatically (see "Spec Sheet Toolbar" in the Virtuoso Specification-driven
Environment User Guide).
To open a documentation file displayed in the Documents folder, do one of the following:
■ Double-click the icon to the left of the item.
■ Right-click the item and select Open from the pop-up menu.
Note: The editor used to display each file type is specified in edit_command.ini (see
“Configuring Other Editors” on page 39).
Any file of the following formats saved to the Documents folder for a project can be displayed
by double-clicking its icon in the open Documents folder (or selecting Open from the
right-click pop-up menu):
■ Text (*.txt)
■ HTML (*.html)
■ PDF (*.pdf)
■ GIF (*.gif)
Note: By default, text files are displayed using the built-in text editor, PDF files are displayed
using acroread, and HTML and GIF files are displayed using Netscape. See also
“Configuring Other Editors” on page 39.
The edit_command.ini file in $ACV_ROOT/admin is used to specify the default editor that
will display the following design file types:
Note: By default, text files are displayed using the text editor (builtin), PDF files are
displayed using acroread, and HTML and GIF files are displayed using Netscape.
Any file extension can be configured using the following general form:
extension "commandString"
where extension is the file extension and commandString is the editor command string.
Surrounding quotation marks are required for any editor command string that contains one or
more blank spaces. If %s is specified as part of the string, then it is replaced by the
fully-qualified path to and name of the file when the command is invoked. If %s is not specified
as part of the string, then the fully-qualified path to and name of the file is appended to the
end of the command string.
The following command specifies the Mozilla browser for displaying HTML documents:
html "/usr/bin/mozilla %s"
The following command specifies the vi editor run in an Xterm window as the default for
editing/displaying SPICE design files:
spice "xterm -geometry 100x30 -e vi"
The following command specifies the CDE editor as the default for editing/displaying Verilog
design files:
verilog dtpad
2. Open the file in vi (for example) to add or edit command strings that specify the desired
default editors:
vi edit_command.ini
…
html "/usr/bin/mozilla %s"
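The %s substitution rule can be sketched as follows (an illustrative reading of the rule, not the tool's actual implementation):

```python
def build_editor_command(command_string, file_path):
    """Build the editor invocation from an edit_command.ini entry.

    If %s appears in the command string, it is replaced by the
    fully-qualified file name; otherwise the file name is appended
    to the end of the command string.
    """
    if "%s" in command_string:
        return command_string.replace("%s", file_path)
    return command_string + " " + file_path

print(build_editor_command("/usr/bin/mozilla %s", "/proj/docs/index.html"))
# /usr/bin/mozilla /proj/docs/index.html
print(build_editor_command("xterm -geometry 100x30 -e vi", "/proj/docs/amp.sp"))
# xterm -geometry 100x30 -e vi /proj/docs/amp.sp
```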
Tests
The Tests folder contains tests used to set up the simulations for a device under test (DUT).
Tests define the stimuli, analyses, simulator options, and measures necessary to run a
simulation. A test consists of the following items:
■ A single DUT
■ Zero or more components and measures
■ Parameter values for the DUT, components, and measures
■ Other simulation controls/commands (and control values)
■ Net names to be used to connect the DUT to the components
■ Connection mapping between components, measures, and DUT
Sweeps/Corners
The Sweeps/Corners folder can contain sweeps (parametric analyses) and corners
analysis setups.
A sweep varies one or more parameters and obtains simulation results for each
parameter value. Sweep values are defined as either a list of values or a range of values with
specified increments (from-to-by). A sweep can be applied to one test or to multiple tests.
Sweeps are accomplished by sweeping project parameters (Perl $ variables) over one or
more tests. Once the tests and sweep parameters have been defined, the sweep can be run.
Running a sweep generates one or more simulation files, and the output results data is
captured.
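A from-to-by sweep range can be expanded into its point list roughly as follows (a sketch only; the environment's own expansion may differ in round-off handling):

```python
def sweep_points(start, stop, step):
    """Expand a from-to-by sweep definition into a list of values."""
    points = []
    value = start
    # include the endpoint, with a small tolerance for float round-off
    while value <= stop + 1e-9:
        points.append(round(value, 12))
        value += step
    return points

print(sweep_points(1.0, 3.0, 0.5))   # [1.0, 1.5, 2.0, 2.5, 3.0]
```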
A corners analysis can consist of any number of model files and parameter settings. Any
combination of tests and conditions for the corners analyses can be specified. Each corner
condition corresponds to a separate simulation which can run independently.
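Because each corner condition corresponds to a separate, independent simulation, the set of jobs is the cross product of tests and conditions, which can be sketched as follows (the test and corner names are hypothetical):

```python
from itertools import product

tests = ["gain_test", "psrr_test"]
corners = ["ff_hot", "ss_cold", "tt_nom"]

# one independent simulation job per test/corner combination
jobs = [(t, c) for t, c in product(tests, corners)]
print(len(jobs))   # 6
```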
Simulations can be distributed across multiple machines (see “Job Distribution Group Box”
on page 228), and separate jobs are automatically created based on the parameter values
being swept. The Job Status tab is used to view the progress and pass/fail status of the
analysis. Simulation log and error files can also be viewed from this tab.
A separate results file is created for each sweep/test or corner/test combination. A tabular
listing or waveform patterns of results can be viewed from the Results tab of the VSdE main
window (see “Using the Results Tab” on page 249).
For more information about sweeps and corners analysis setup, see Chapter 8, “Creating
and Running Advanced Analyses.”
Spec Sheets
The Spec Sheets folder contains design specifications, which can be used for design review
presentation and to compare against simulated results. These spec sheet setups can be
exported to Plans, and exported to HTML for formatted printing. In addition, the Spec Sheet
tool can be used to:
■ Manage and organize the data returned from single or multiple analyses
■ Report Pass/Fail conditions for measurements
■ Provide means for further inspection of the data
The Spec Sheet tool provides a means to verify a design against a set of required results, or
to verify one design (behavioral or circuit) against another (behavioral or circuit).
For more information about spec sheets, see Chapter 11, “Verifying and Comparing Designs.”
Model Calibrations
The Model Calibrations folder contains model calibration setups used to design and
generate silicon-calibrated behavioral models. Parameters in the calibrated (behavioral)
model take on real values (based on characterization results).
For more information about the Model Calibration tool, see Chapter 12, “Calibrating
Behavioral Models.”
For more information about the Characterization and Modeling tool, see the Virtuoso
Characterization and Modeling Environment User Guide. (For information about the
Includes and Sim Options tabs of the VCME Setup window, see “Includes Tab—Spectre
Tests” on page 191, and “Sim Options Tab—Spectre Tests” on page 198.)
Plans
The Plans folder contains plans that define the steps to be taken for a process. For example:
■ Characterization Plan – to obtain characterization results for a circuit
■ Model Generation Plan – to generate a silicon-calibrated behavioral model for a circuit
Plans, which are run on projects, are typically created using the Export to Plan right-click
menu item from the various other project items (Tests, Sweeps/Corners, Spec Sheets, Model
Calibrations). Also, for tests, sweeps, and corners analyses, the Link to Plan right-click menu
item can be used to create a reference to the project item from within a plan. All or parts of
files associated with tests, sweeps, corners analyses, spec sheet comparisons, and model
calibrations are available for export to a plan.
Alternatively, a plan can be created using the built-in text editor or an external editor. A plan
can be invoked either from the environment or from the system prompt. A project can contain
multiple plans—for example, one plan to create characterization results for the circuit, another
plan to generate a behavioral model. Often, one plan is designed to run other plans.
For more information about plans, see Chapter 13, “Developing and Editing a Plan,” and “Perl
Extensions” in the Virtuoso Specification-driven Environment Reference.
The Test Console tab is available only when the Files tab is active in the VSdE main window.
The Measures table is displayed only when Test Measures is marked on the View menu in
the VSdE main window (see “View Menu” in the Virtuoso Specification-driven
Environment Reference).
Run Button
The Run button can be found in the following analysis setup windows and is used to run
analyses:
■ Test Setup (see “Test Setup Window for Schematic-Based Tests” on page 112)
■ Sweep (see “Sweeps” on page 201)
■ Corners (see “Corners Analysis” on page 209)
■ Monte Carlo (see “Monte Carlo Analysis” on page 216)
If any changes have been made since the last Apply in any of these windows, a form appears
with a message and choices like the following prior to running the analysis:
Values have changed. Do you want to save the changes before run?
❑ Yes — Apply the changes and run the analysis.
❑ No — Do not apply the changes. Run the analysis using the last-saved settings.
❑ Cancel — Don’t apply the changes; don’t run the analysis.
Environment Customizations
(Optional) You can perform any of the following environment customization tasks:
■ Specifying Environment Defaults Using Environment Variables on page 46
■ Configuring Other Editors on page 39
■ Creating a Custom Toolbar on page 46
■ Populating a Custom Toolbar on page 47
■ Removing an Icon from a Toolbar or Menu on page 47
■ Deleting a Custom Toolbar on page 48
■ Adding a Tool to the Tools Menu on page 48
■ Assigning a Shortcut to a Menu Item on page 49
■ Removing a Menu Shortcut on page 50
Important
The Delete action cannot be undone.
Important
The accelerator key must be unique (not assigned to another item on the Tools
menu).
Note: The up and down arrows on the toolbar are used to change the order in
which the tools appear on the Tools menu by moving a selected item up or down in the
list, respectively. The delete button is used to remove an item from the Tools menu.
5. At the right end of the Command field, click the browse button.
A form for navigating to and selecting the tool’s executable appears.
6. Navigate to and select the tool’s executable.
7. Click OK.
8. (Optional) In the Arguments field on the Tools tab, type any command-line arguments
for the tool.
9. At the right end of the Initial directory field, click the right-arrow button.
10. From the pull-right menu, choose one of the following for specifying the initial directory
for the tool:
❑ Current Results Directory ($currentResultsDir)
❑ Current Project Directory ($project)
❑ Browse (for any directory)
11. For applications that do not have a graphical user interface (GUI), check the Use xterm
with scrollbar check box.
12. Click Apply or OK.
4. Click Assign.
The Assign Shortcut form appears.
5. In the Press new shortcut key field, type the desired shortcut.
Note: If the specified shortcut is assigned to any other menu item, then that menu item
is indicated under Current Assignment. If the specified shortcut is not assigned to any
other menu item, then the Current Assignment is indicated as (none).
6. Click OK to assign the new shortcut to the selected menu item.
Important
The Current Assignment, if one exists, is replaced by the new assignment typed
on the Assign Shortcut form.
2
Creating Workspaces and Projects
Overview
A workspace is composed of one or more related projects and parameters. The workspace
directory is typically the top-level directory of the hierarchy. Multiple workgroups or members
of a workgroup can use this single directory to store and work with related projects.
A workspace can contain more than one project. A project can be referenced by more than
one workspace. A project and its related files are accessed by opening a workspace that
contains that project.
A workspace contains a working library for accessing the design files for a device under test
(DUT). The working library is a lib/cell/view file tree that stores design information. The
lib/cell/view browser within the environment allows selection of the proper circuit design file
(for example, SPICE, Verilog-AMS). If a behavioral model is created for a target DUT, then
the model is stored as a view for that cell in the working library.
Note: The libraries used in the environment can be the same libraries used by the Virtuoso
Schematic Editor (both are organized in a lib/cell/view hierarchy).
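The lib/cell/view organization described above can be illustrated as follows (a sketch; the library, cell, and view names are hypothetical):

```python
from pathlib import PurePosixPath

def view_path(worklib, cell, view):
    """Compose the directory for one view in a lib/cell/view tree."""
    return PurePosixPath(worklib) / cell / view

# e.g., the SPICE view of a cell named "ota" in the working library
print(view_path("worklib", "ota", "spice"))   # worklib/ota/spice
```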
The first line points to the working library subdirectory (worklib); the second line
points to the default library mapping file in the release hierarchy (which provides the
location of the Cadence MATLAB measures for those with a full MATLAB license—see
“acv.lm Syntax” in the “Library Mapping File” appendix of the Virtuoso
Specification-driven Environment Reference).
■ Subdirectories for each of the project tools (see “Project Tools” on page 36)
■ A working library (worklib) structured in a lib/cell/view hierarchy in which netlists
generated for the project are stored
Note: A single project directory can contain multiple designs and characterization tests
for each design
The project directory contains ota.apf (the project file), a default library mapping file,
subdirectories for the project tools, and a working library (worklib).
To use the Virtuoso® Specification-driven Environment (SdE), design engineers create a new
workspace or open an existing workspace (see “Creating and Opening a Workspace” on
page 53). A workspace directory contains projects, design files, tests, results, and plans that
are related. Several related designs, and the characterization tests for these designs, can be
stored in the same workspace directory so that these designs can be characterized using a
common workspace. Tools within the environment can locate required data without having to
prompt the user for locations.
3. In the Location field, type a directory location for the new workspace.
Tip
Use the browser display button to open the Browse for Folder window for
navigating to and selecting a directory location for the new workspace.
4. In the Name field, type a name for the new workspace. This text appears automatically
at the end of the string in the Location field.
5. (Optional) Mark the Create new project check box to indicate that a new project is to
be created when the new workspace is created.
Note: This check box is marked by default unless the
If the Create new project check box is marked, then the New Project Wizard opens (see
“Adding a Project to a Workspace” on page 58). If the Create new project check box is not
marked, then an empty workspace (one with no projects) appears in the main window.
Tip
The ACV_DEFAULT_WORKSPACE_PATH environment variable can be set prior to
starting the environment to specify the full hierarchical path to a default workspace
from which to copy templates and parameters. When this environment variable is
set, the Copy templates and parameters from existing workspace check box
on the New Workspace form is marked by default, and the specified file is displayed
in the field beneath it. For example:
setenv ACV_DEFAULT_WORKSPACE_PATH /home/user/wksp/default.awf
Tip
See also Chapter 3, “Managing Workspaces and Projects.”
4. Click Open.
The last-saved version of the selected workspace is displayed in the VSdE main window.
2. Select a workspace from the drop-down list in the Open recent field.
3. Click OK.
The selected workspace, with its last-active project, is displayed in the VSdE main window.
When the Open VSdE in read-only mode check box is marked, then no workspace or
project item can be created, modified, or deleted. Most Design Management operations are
disabled (see “Design Management Operations” on page 73). Read-only mode ends when
the session ends.
Note: Marking this check box on the Select Workspace form has the same effect as starting
the environment at the system prompt using the -readonly command-line option (see
“Starting the SdE from the System Prompt” on page 20).
The first form of the New Project Wizard asks the following question:
Would you like to base your project on a previously created project?
Selecting No lets you create a new project (see “Creating a New Project” on page 60).
Selecting Yes lets you base the new project on an existing project (see “Creating a New
Project Based on a Previous Project” on page 61).
Tip
Click to open the Browse for Folder window for navigating to a project location.
❑ Name of the new project
❑ Description of the new project (optional)
Note: When running the New Project Wizard as a step in the process of creating a new
workspace (see “Creating a New Workspace” on page 54), the location and name of the
new project are the same as for the new workspace, by default. The check box beneath
the Description field is used to make the Location and Name fields active so that the
default information can be changed:
4. Click Finish.
The new project is displayed and active in the VSdE main window.
❑ Click to open the Select a Project File form, then navigate to and select a
previously created project file (.apf).
❑ Type the path to and name of a previously created project file (.apf) in the Project
file field.
3. Click Next.
4. Type the following information on the second New Project Wizard form:
❑ Location of the new project
Tip
Click to open the Browse for Folder window for navigating to a project location.
❑ Name of the new project
❑ Description of the new project (optional)
5. Click Finish.
The new project is displayed and active in the VSdE main window.
Project items include tests, sweeps, corners analyses, spec sheets, model calibrations,
characterization and modeling items, and plans. The following forms apply:
■ Create Test form—see one of the following:
❑ Create Test Form—Composer Tests on page 107
❑ Create Test Form—Spectre Tests on page 181
❑ Create Test Form—ADE Tests on page 165
■ New Sweep Name form—see “Sweeps” on page 201
■ New Corners Name form—see “Corners Analysis” on page 209
■ New Specification form—see “Creating and Opening a Spec Sheet” on page 294
■ New Model Calibration Name form—see “Starting Model Calibration” on page 319
■ New Characterization and Modeling form—see “Launching the CME from the SdE”
in the Virtuoso Characterization and Modeling Environment User Guide
■ New Plan form—see “Creating a Plan” on page 341
Design files include Verilog-AMS, SPICE, MATLAB, and text files (see “Text Editor” in the
Virtuoso Specification-driven Environment Reference).
When a “… by reference” template is used, the values from the template are referenced by
the test, sweep, or corners setup. Changes made to a “… by reference” template are
available to all tests, sweeps, or corners setups that reference that template.
Important
A “… by reference” template cannot be created for a test setup that is targeted for
an ADE-available simulator.
When a “… by copy” template is used, the values in the Template Setup form are copied
and used as starting values. Changes made to a template are available only to tests, sweeps,
or corners that import the template after changes have been made.
New templates can be created, and existing templates can be imported from existing tests,
sweeps, or corners setups. A single set of include file definitions can be referenced by
multiple tests by creating a Test includes by reference template. A new test can be created
based on another test by making the existing test available as a Test template in the
workspace. Similarly, a sweep or corners analysis can be created using a “… by copy”
template as a starting point or by referencing an appropriate “… by reference” template.
Template setup files are stored in the workspace directory (see “Overview” on page 51).
To open the Workspace Templates form for creating and managing workspace templates, do
one of the following:
■ In the VSdE main window, choose Project – Workspace Templates, then select either
Test or Sweep/Corner
■ Click Templates on any form where it appears (for example, on the Create Test form)
The list of templates available to the current workspace is displayed in the Templates list
area. The following buttons appear on the Workspace Templates form:
Button Description
New Opens the New Workspace Template form (see “Creating a New
Workspace Template” on page 67)
Import Opens the Import Workspace Template form (see “Importing a
Workspace Template” on page 68)
Edit Opens the Setup form associated with one of the following menu
items:
■ Workspace Includes Setup
■ Test Template Setup
■ Sweep Template Setup
■ Corners Template Setup
Delete Removes the selected template from the list of available templates
Copy Copies the selected template, giving the newly copied template the
same name as the original template with a numeric suffix
Tip
Use the Rename button to rename the template.
Rename Highlights the template name in an edit field for modification
DM Presents a drop-down menu of the following Design Management
operations (see “Design Management Operations” on page 73):
■ Update
■ Check Out
■ Check In
■ Cancel
Close Saves changes and closes this form
Help Opens Online Help for this topic
The following right-click pop-up menu items apply to items listed in the Templates list area:
The Properties menu item is used to display the name of and path to the selected template
file in the Properties window (see “Properties Window” in the Virtuoso
Specification-driven Environment Reference). The remaining operations are described
in the table above.
2. For test templates, select a target simulator for the template from the For simulator
drop-down list.
Important
A “… by reference” template cannot be created for a test setup that is targeted for
an ADE-available simulator.
3. Type a name for the new template in the Template name field.
4. Click OK to display the appropriate tabs for the indicated Template type.
Note: When the Template type is “… by reference”, the Includes, Sweep, or
Corners tab is displayed in the Workspace Includes Setup window for specification of
include definitions. When the Template type is “… by copy”, the Template Setup
window is opened for specifying desired starting values for a test, sweep, or corners
analysis.
5. Specify information on the tabs presented.
6. Click OK to create the template and import its name to the list of available templates (in
the Templates list area on the Workspace Templates form).
To import a workspace template from a test, sweep, or corners setup, do the following:
1. On the Import Workspace Template form, select a Template type radio button.
❑ Test includes/Sweep/Corners by reference: The set of test includes, sweep or
corners parameters defined in the template can be referenced by any test, sweep,
or corners setup
❑ Test/Sweep/Corners by copy: The template values can be copied into the current
test, sweep, or corners setup, and used as starting values for the various elements
(for example, include files, simulation options, measures, sweep or corners
parameters)
2. Select one of the Import from radio buttons as follows:
❑ To import a workspace template from a test, sweep, or corners setup in the current
project, select the Import from […] in project radio button and do the following:
a. Select from the available items on the drop-down menu in the field beneath the radio
button. A name matching the selected item is displayed in the Template name field.
b. Optionally, change the name displayed in the Template name field.
❑ To import a workspace template from a test, sweep, or corners setup not in the
current project, select the Import from other […] file radio button and do the
following:
a. Click in the Import from other […] file field to display the Select a […]/Template
File form for navigating to and selecting a test (.tst), sweep or corners (.swp) file,
or a by-reference (.wsinc, .wsinc_swp) template file.
b. Double-click the file name or click Open to select it. A name matching the selected
item is displayed in the Template name field.
c. Optionally, change the name displayed in the Template name field.
3. Click OK to open the appropriate window for the indicated Template type—the
Workspace Includes Setup window for a workspace includes template, or the Test
Template Setup window for a test template.
4. On the Workspace Templates form, click Close.
3
Managing Workspaces and Projects
Items that are checked in (managed) are indicated by a bold M in the lower left corner of the
item’s icon. Items that are managed and checked out (locked) are indicated by a red check
mark in the lower left corner of the item’s icon.
A managed item can be defined as having Strict or Relaxed version control as follows:
Control Description
Strict Exclusive read/write access only
When an item is marked as Strict, it is read-only unless and until it is
checked out. The item is then locked for use (editing) until it is checked back
in again.
Note: Read-only items cannot be modified in the environment, but the Run
action is enabled.
Relaxed Concurrent read/write access
When an item is marked as Relaxed, new versions of the item can be
checked in without requiring a checkout operation first. Typically, the
workspace file (.awf) and results files are marked as Relaxed.
Important
No attempt is made to detect, merge, or otherwise incorporate
intervening changes made to any other local copies.
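The two control modes described above can be modeled roughly as follows (a simplified sketch, not the actual Design Management implementation):

```python
class ManagedItem:
    """Minimal model of Strict vs. Relaxed version control."""

    def __init__(self, name, control="Strict"):
        self.name = name
        self.control = control      # "Strict" or "Relaxed"
        self.locked_by = None       # user holding the checkout lock

    def check_out(self, user):
        if self.locked_by is not None:
            raise RuntimeError(f"{self.name} is already checked out")
        self.locked_by = user       # exclusive edit access

    def check_in(self, user):
        if self.control == "Strict" and self.locked_by != user:
            # Strict items must be checked out before checkin
            raise RuntimeError("check out required first")
        self.locked_by = None       # lock released on checkin

item = ManagedItem("ota.tst", control="Strict")
item.check_out("alice")
item.check_in("alice")            # ok: alice held the lock

results = ManagedItem("results.awf", control="Relaxed")
results.check_in("bob")           # ok: Relaxed needs no prior checkout
```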
The set of managed files associated with a test file (.tst), as well as the type of version
control applied to each item, is specified by the site administrator in the
dm_default.config file in $ACV_ROOT/admin.
Note: Even without Design Management, basic file locking is provided to prevent multiple
users from editing/writing the same project files at the same time.
Operation Description
Update Retrieves the latest checked-in version of an item from the repository
and copies it to the workarea; for a project, the Update form is opened
(see “Design Management Update” on page 84)
Note: This operation is available for Relaxed items, or for Strict items
that are not checked out. A Strict item that has been checked out
cannot be updated.
Check Out Performs the Update operation and locks the item or group of items
for exclusive modification by a single user (see “Design Management
Check Out Form” on page 86)
Cancel Cancels the Check Out operation, unlocking the item if locked; no
changes are made to the previously checked-in version
Check In Commits (writes) a new version of the item or group of items to the
repository (see “Design Management Check In Form” on page 74)
Checkpoint Opens the Checkpoint form for creating, restoring, and using
checkpoints (see “Design Management Checkpoint Form” on
page 79)
Delete As a checkin qualifier, marks the item for removal from the
workarea—the effect on the repository copy varies with the
underlying DM system (see “Check In Operation Qualifiers” on
page 78)
Status Opens the Status window for viewing status and version information
for each individual item in a group of items (see “Design Management
Status Window” on page 82)
Libraries For tests only: Provides a mechanism for checking in any
checked-out cellviews associated with the test
The core menu items (Update, Check Out, Cancel, Check In) are available on the following
right-click pop-up menus:
Whether checking in a single item or a group of items, the OK button is used to close the form
after performing the Check In operation. The Cancel button can be used to close the form
without completing the Check In operation. Other related topics include the following:
■ Check In Form for Single Items on page 75
■ Check In Form for a Group of Items on page 76
The following information can be specified on the Check In form for a single item:
Information Description
Checkin description Text string that gets displayed as part of the History (see
“Design Management Tab in Properties Window” on page 81)
Keep checked out Check box to mark the item for immediate checkout following
the Check In operation
Column Information
File Name Name of file, including complete hierarchical path
Tip
The columns on the form can be resized by hovering over the
column separations: When the double arrow appears, click-drag to
adjust the column width. The form size can be adjusted by
hovering over the edges or corners of the window: When the resize
cursor appears, click-drag to adjust the window size.
Operation Drop-down menu of Check In operation qualifiers (see “Check In
Operation Qualifiers” on page 78)
Status Status keyword (see “Status Keywords” on page 79)
Column Information
Version Repository version information (or blank for items that have not been
checked in previously)
Size Size of the file
The Set Operation button is used to perform the operation selected from the drop-down
menu to the left of the button on the list of files displayed in the File Name column (see
“Check In Operation Qualifiers” on page 78). The list of files displayed is determined by which
check boxes are marked in the Show files group box as follows:
The check boxes in the Scan directories group box, when present (for project checkin), are
used to indicate additional directories to include/exclude when scanning for files to list in the
File Name column:
Important
As the results directory can contain a very large number of files, which are typically
not managed, this check box is unmarked by default in order to save time and
eliminate the clutter of unnecessary file names in the list.
The number of files displayed in the list is indicated in the status bar beneath the column.
Tip
The Check All and Uncheck All buttons in the Show files and Scan directories
group boxes can be used to mark or unmark all check boxes in each group box with
a single action.
The Description field is used to type a text string that gets displayed as part of the History
for each individual file (see “Design Management Tab in Properties Window” on page 81).
Note: The index file is updated with the Description the first time an item is checked in. This
initial description does not change with subsequent checkins, but subsequent descriptions
can be provided to indicate the reason for each new checkin.
Qualifier Description
Cancel Checkout Cancels the checkout operation
Note: The impact of this action on the local workarea copy
depends on the underlying Design Management system.
Check In Checks in the items
Delete Removes the items from the workarea; the effect on the
repository copy varies with the underlying Design Management
system; for example, DesignSync marks the item as inactive,
VersionSync removes all versions and history information for
the item
Status Keywords
The following keywords can appear in the Status column on the Check In form for multiple
files (see “Check In Form for a Group of Items” on page 76) or the Status window (see
“Design Management Status Window” on page 82):
Keyword Meaning
CIN Checked in
COTH Checked out by someone else
COUT Checked out
ERR Is in an inconsistent state
INACT 1 Inactive/deleted
NODM Not managed
UNMAN Unmanageable (or managed by another DM system)
1. This keyword applies only to DesignSync managed items.
Important
A recommended method for tracking checkpoint names is to check in a project or
workspace immediately prior to creating the checkpoint, specifying the intended
checkpoint name in the checkin description. This description text appears as part of
the History information that gets displayed in the Properties window (see “Design
Management Tab in Properties Window” on page 81).
The Checkpoint form is displayed when Checkpoint is selected from one of the following
Design Management menus:
■ Right-click pop-up menu for a workspace
■ Right-click pop-up menu for a project
For information about these right-click pop-up menus, see “Files Tab” in the Virtuoso
Specification-driven Environment Reference.
The following operations are provided on the drop-down menu in the Operation field of the
Checkpoint form:
Operation Description
Create Applies the checkpoint tag (Name) to all managed files in the
workspace or project
Note: Restrictions on what the Name can be depend on the
underlying Design Management software being used.
Restore – No checkout Updates the local copy of the set of managed files (in the
workarea) from the previously saved checkpoint (Name);
files in the repository are not marked as
Checked out/Locked
Restore – With checkout Attempts to update the local copy of the set of managed files
(in the workarea) from the previously saved checkpoint
(Name) by checking out and locking files from the repository
Note: The current repository version can be updated by
subsequently checking in the restored copy.
Information Description
Name Same as on the General tab (see “Properties Window” in the
Virtuoso Specification-driven Environment Reference)
Path
Version Repository version number for the item as of its most recent Update
or Check Out operation (see “Design Management Operations” on
page 73)
Status Status keyword (see “Status Keywords” on page 79)
History List of all versions of the item, with the most current version at the top
of the list; each item in the list includes its version number, status
keyword, and a brief description
The following columns of information about each item in a set of managed files are displayed
in the Status window:
Item Description
File Name Name of file, including complete hierarchical path
Tip
The columns of the window can be resized by hovering
over the column separations: When the double arrow
appears, click-drag to adjust the column width. The
window size can be adjusted by hovering over the
edges or corners of the window: When the resize cursor
appears, click-drag to adjust the window size.
Status Status keyword (see “Status Keywords” on page 79)
Version Repository version information as of the most recent Update or
Check Out operation (see “Design Management Operations”
on page 73)
Note: The column is left blank for items that have not been
checked in previously.
The list of files displayed in the File Name column is determined by which check boxes are
marked in the Show files group box as follows:
The check boxes in the Scan directories group box, when present (for project status), are
used to indicate additional directories to include/exclude when scanning for files to list in the
File Name column:
Important
As the results directory can contain a very large number of files, which are typically
not managed, this check box is unmarked by default in order to save time and
eliminate the clutter of unnecessary file names in the list.
The number of files displayed in the list is indicated in the status bar beneath the column.
Tip
The Check All and Uncheck All buttons in the Show files and Scan directories
group boxes can be used to mark or unmark all check boxes in each group box with
a single action.
Note: The Print button is used to get a printed copy of the displayed list.
The Update operation (see “Design Management Operations” on page 73) involves
exporting a copy of the currently checked-in version of an item to the local workarea. If the
item already exists in the local workarea, then a warning is issued along with the following set
of options:
■ Cancel the operation for that item
■ Rename the item before copying it to the local workarea
■ Overwrite the item in the local workarea with the copy exported from the repository
Selection Description
Only VSdE project folders Specifies that items in the standard project folders are
to be updated
Note: Items in project folder subdirectories are not
updated. Use the other radio button selection to update
these items.
Entire project directory hierarchy Specifies that all files in the entire project
directory hierarchy are to be updated, including (but
not limited to) workspace files and any results files
that were previously checked in
Additionally, update results files Enables (when marked) or disables (when unmarked)
the Update operation for results files in the set such
that, when the check box is unmarked, results files
are not updated
Note: This check box is available only when the Only
VSdE project folders radio button is selected.
Note: Update is available for Relaxed items, or for Strict items that are not checked out. A
Strict item that has been checked out cannot be updated.
Note: A Relaxed item need not be checked out before a new version can be checked in.
When checking out multiple files, the check box to the left of each File Name in the list is
used to select one, some, or all files for checkout. The Check Selected and Uncheck
Selected buttons can be used to mark or unmark check boxes in selected rows with a single
action.
Tip
A set of adjacent rows can be selected by clicking the mouse in a row and dragging
up or down. Individual rows can be added to or removed from the set by holding
down the Ctrl key and clicking the mouse. All rows can be selected by clicking
anywhere in the File Name heading. All check boxes can then be marked (or
unmarked) by clicking Check Selected (or Uncheck Selected).
■ Copy project from DM and open is selected on the Select Project form (see “Adding
a Project to a Workspace” on page 58)
The set of Workspace or Project files available in the selected Workarea are displayed on
this form. The destination for the local copy of the selected Workspace or Project file is
always relative to the Workarea. The button is used to display a form for navigating to
and selecting a Workarea. The following columns of information are provided in the Source
table:
Button Description
OK Creates a local copy of the selected Workspace or Project at the location
specified in the Destination field; if the item is a Project, then that project is
added to the active workspace
Cancel Closes the Copy from Design Management form without creating a local
copy of the selected Workspace or Project; returns control to the Open
Workspace or Open Project form
4
Understanding Parameters
Project Parameters
Project parameters are global to a project and can be used with any design in that project.
They are used in tests, sweeps, corners, and other tools. For example, a sweep can be used
to simulate the design with various values for the project parameters.
Project parameters are prefixed with $ (for example, $temperature). A project parameter
is global to the entire project such that when its value changes, everything in the project that
is dependent upon that parameter changes. Project parameters and their values are listed in
the Project Parameters table on the Parameters tab in the VSdE main window:
Nearly anything that you need to alter can be defined using a project parameter. For example,
the project parameter $process can be used to specify the model file (for example,
bsim3.mod) for a project. Models can be altered (for example, during a corners analysis)
using project parameters instead of hard-coded model names.
For example:
$vdd_value = "3";
$temperature = "27";
These same tasks are available from the pop-up menu that appears as a result of
right-clicking on Active Project (just above the Project Parameters table):
Note: The additional Properties menu item opens the Properties window (see “Properties
Window” in the Virtuoso Specification-driven Environment Reference).
Any parameter that is declared to be a project parameter (by adding it to the Project
Parameters table) can be swept, used in a test, or used wherever a project parameter might
be required.
Project parameter values can be simple numbers (including numbers with engineering
suffixes—for example, “m” for 1e-3, “u” for 1e-6), text, or expressions that reference other
project parameters. Project parameter values can also reference environment variables using
Perl syntax.
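Building on the $vdd_value and $temperature examples above, the sketch below gathers the value forms just described. The names and values are illustrative, and the environment-variable form assumes the standard Perl $ENV{...} syntax:

```perl
$vdd_value = "3";                  # simple number
$load_cap  = "10p";                # number with engineering suffix (10p = 10e-12)
$vdd_half  = "$vdd_value/2";       # expression referencing another project parameter
$model_dir = "$ENV{HOME}/models";  # environment variable, using Perl syntax
```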
The Add Parameter form is displayed when any parameters that are referenced in any tests
of the project are not found in the Project Parameters table:
Tip
Click in the Param Type column to display a drop-down list: Project, Workspace.
Selecting Workspace as the Param Type causes a parameter to be placed in the
Workspace Parameters table instead of the Project Parameters table.
➤ Click OK to add these parameters and their default values.
Note: The Value column of the Project Parameters table is blank for any parameter that
has multiple variable values (for example, swept parameters), or does not have a default
value.
Workspace Parameters
Workspace parameters belong to a workspace and take precedence over project parameters.
Using workspace parameters, multiple projects can share a common set of parameter values
(for example, $temperature, $process, $vdd_value). Workspace parameters are
grouped together in value sets. A value set is a logical grouping of workspace parameters
that are varied together. For example, one value set might hold parameters for 1.8-volt
power and another for 3.3-volt power.
The Workspace Parameters table appears above the Project Parameters table in the
VSdE main window when there are one or more workspace value sets in the workspace.
View - Workspace Value Sets is used to toggle the display of the Workspace
Parameters area (see “View Menu” in the Virtuoso Specification-driven Environment
Reference).
Note: Plans created in one workspace project can run plans in other workspace projects.
Simulation results are grouped by workspace value set name and can be compared using the
Spec Sheet tool.
If a parameter is found in both the Workspace Parameters table and the Project
Parameters table, the value in the Workspace Parameters table takes precedence.
Parameters can be cut or copied from one table and pasted into the other as follows:
1. Select the entire row (parameter and value) from the source parameter table by clicking
in either column and dragging the mouse cursor along the row to the other column.
2. Choose Edit – Cut or Edit – Copy to perform the cut or copy action.
3. Click the Add button beneath the destination parameter table to select the destination for
the cut/copied parameter and value.
4. Choose Edit – Paste to paste the cut/copied value into the destination table.
Note: Simulation results are written to a directory named after the current Workspace Value
Set for the current lib/cell/view, if one exists; otherwise, the results directory is results.
Field Contents
Workspace value set name Name of the workspace value set
Copy parameters from Pull-down list of existing workspace sets from which to
copy the initial set of parameters
Note: Selecting “- none -” from this list creates an empty
workspace value set.
The following buttons appear on the New Workspace Value Set form:
Button Description
OK Adds the workspace value set name to the Workspace Parameters table,
which appears above the Project Parameters table in the VSdE main
window when there are one or more workspace value sets in the workspace
Cancel Closes the form without applying any changes
Help Opens the Online Help page for this topic
Item Description/Procedure
Add 1. Type the name of the new workspace value set in the Workspace
Value Set Name field on the New Workspace Value Set form (see
“New Workspace Value Set Form” on page 96).
2. Select a workspace from which to copy the initial set of parameters (or
“- none -” to create an empty workspace value set) from the
drop-down list in the Copy parameters from field.
3. Click OK.
Delete Removes the currently active workspace value set.
Rename 1. Type the new name of the workspace value set in the New
Workspace Value Set Name field on the Rename Workspace
Param Set form:
2. Click OK.
Global Parameters
Global parameters are design variables that are global to the design.
■ For schematic-based tests, global design variables are declared and mapped to project
parameters on the Globals tab of the Test Setup window (see “Globals Tab—Schematic
Designs” on page 115).
■ For standalone netlist-based tests, global parameters are declared and defined
according to methods supported by the target simulator.
Note: For Spectre® circuit simulator tests, global parameters are defined using a
parameters statement in the netlist (see “Parameters Statement” in the Virtuoso
Spectre Circuit Simulator User Guide):
parameters parameter_name = value
a. Click the Add button beneath the appropriate Parameters table in the VSdE main
window (either Project or Workspace).
b. Type the name of the global parameter, prefixed with the Perl $, in the Parameters
column of the table (for example, $vdd_value).
c. Type the value in the Value column (for example, 3).
Whenever the value of the project or workspace Perl variable (for example, $vdd_value) is
changed—for example, during a sweep analysis (see “Sweeps” on page 201)—the value of
the declared global parameter (for example, vdd_value) is also changed. A sweep analysis
works on project parameters and workspace parameters, which can be applied to multiple
tests in the project.
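To make steps a through c concrete, the following sketch pairs the VSdE parameter with the netlist's global parameter. It reuses the document's vdd_value example; the pairing shown is an illustration, not tool output:

```
// Standalone Spectre netlist: declares the global parameter
parameters vdd_value = 3

// Corresponding VSdE Parameters table entry (Perl syntax):
//   $vdd_value = "3"
// Changing $vdd_value (for example, in a sweep) also changes
// the value of vdd_value used in the netlist.
```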
Local Parameters
For the Spectre circuit simulator, a local parameter is specified using a parameters
statement at the beginning of a subcircuit definition (see “Parameters Statement” in the
Virtuoso Spectre Circuit Simulator User Guide). Local parameters can be set for the
DUT as well as for other components in the design (for example, for any design subcircuit that
is added as a component).
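As a minimal sketch (the device names, terminal order, and parameter names are illustrative, not taken from the document), a local parameters statement at the start of a Spectre subcircuit definition might look like this:

```
// Local parameters are visible only within the subcircuit definition
subckt my_inv (in out vdd gnd)
parameters wp=2u wn=1u          // defaults, overridable per instance
MP (out in vdd vdd) pmos w=wp l=180n
MN (out in gnd gnd) nmos w=wn l=180n
ends my_inv

// An instance can override a local parameter:
I1 (a y vdd 0) my_inv wp=4u
```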
5
Test Setup from the Schematic Window
Tests provide stimulus and loads components for a device under test (DUT), and define all
measures and values necessary to obtain a given result. A test can consist of the following:
■ A single design netlist
■ Zero or more components
■ Parameter values for the design, components, and measures
■ Simulation controls/commands (and control values)
■ Net names to which additional components can be connected
■ Connection mapping between components, measures, and the design
■ Zero or more measures
The contents of a test vary depending on analysis types and circuit configurations. Most tests
require the following items:
■ Design netlist, parameterized circuit, or Verilog-AMS behavioral model
■ Voltage/current sources and stimulus components
■ Simulation script containing commands to specify the analysis, options, and measures,
and to set parameters for the design
The following tasks are supported when using the Virtuoso® Specification-driven
Environment (SdE) with the Virtuoso Schematic Editor (see also “Advanced Analyses and
Tasks” on page 104):
■ Creating one or more schematics for a design
■ Starting the SdE on page 105
■ Creating tests to simulate the design (see “Create Test Form—Composer Tests” on
page 107)
■ Running the test (see “Run Tab” on page 150)
The environment further supports the following interactions with the schematic editor:
■ Open a schematic from the environment
■ Select the following items from a schematic:
❑ Net and source selections for analyses
❑ Probe nets and terminals
❑ Measure nets and sources
❑ Stimulus hook-up nets
❑ Nets and terminals for waveform plotting
■ Import and export design variables to/from a schematic
■ Browse cds.lib libraries using library browsers
■ Change the view of a design and run sweeps/corners analyses again using overrides
■ Open the Hierarchy Editor from the Run Options tab in the environment
■ Save a calibrated behavioral model to a library
■ Back-annotate operating point information
Note: A schematic design does not need to be placed anywhere special to work with the
SdE; it can remain in its usual location, pointed to by cds.lib. Designers do not need to
netlist their design because the environment performs this task.
Also, for schematic-based tests, the $IC_START environment variable is automatically set to
the directory where icms (or icfb) was started. This special environment variable can be used
in any field of the Test Setup window (as in $IC_START/models).
Once the environment is started, a new test can be created (see “Create Test Form—
Composer Tests” on page 107). The new test can be based on or copied from an existing test
(see “Copy Test Form” on page 110). Tests are stored in the Tests folder on the Files tab of
the VSdE main window.
Note: The right-click pop-up menu for a test can be used to manipulate the test or its name
(see "Files Tab" in the Virtuoso Specification-driven Environment Reference).
3. Select a workspace or create a new one (see “Creating and Opening a Workspace” on
page 53). If the current schematic does not have a test associated with it already, the
Create Test form appears (see “Create Test Form—Composer Tests” on page 107).
To start the environment from the Command Interpreter Window (CIW), do the following:
1. From the system prompt, start DFII by typing the following (or similar) command:
icms &
2. From the CIW, choose Tools – VSdE to open the VSdE main window.
3. Browse for and select the design for which a test is to be created.
Note: Available libraries are located using cds.lib.
4. Select a workspace or create a new one (see “Creating and Opening a Workspace” on
page 53).
Multiple tests can reference multiple designs, which can all be opened in the environment.
Multiple designs can be added as projects in the same workspace. Sweeps and corners
analyses can be defined for all tests, even if the tests reference different designs.
When the Create Test form appears as a result of starting the environment from the
Schematic Window (see “Starting the SdE from the Schematic Window” on page 105), the
lib/cell/view of the schematic is displayed in the Library, Cell, and View fields of the Design
location group box. The default name for the test (displayed in the Test name field) is
derived from the cell name but can be changed. The Browse button is used to change the
design location information.
From the VSdE main window, any of the following methods can be used to open the Create
Test form:
■ Right-click the Tests folder on the Files tab and select New Test from the pop-up menu.
■ Click on the toolbar.
■ Choose Add – New Test.
■ Choose File – New (shortcut = Ctrl-N), then select the Test radio button in the New File
window and click OK.
■ Right-click a test in the Tests folder and select New from the pop-up menu.
Note: This method is available only when there is at least one test in the Tests folder.
Once the Integration type has been selected, the target simulator for the test can be selected
from the drop-down list in the Simulator field.
2. (Optional) Click Templates to open the Workspace Templates form for creating and
managing workspace templates (see “Creating and Managing Workspace Templates”
on page 64).
Tip
Click to open the Browse for Folder window and navigate to the state directory.
Click OK when finished in this window.
Note: The starting point in the Browse for Folder window for state directory is
~/.artist_states/libName/cellName, unless a different root directory is
specified in .cdsenv using the asimenv saveDir setting (for example,
./artist_states).
3. Click OK.
If the name of the dropped/pasted test is the same as another test already in the target folder,
then the Copy Test form is opened:
The target folder can be one of the following in any open project in the main window of any
session:
■ The same as the source folder
■ A like project folder in another open project (for example, a test can only be copied into
a Tests project folder)
■ A like project folder in an open project in another session
Tab Purpose
Design Specify the lib/cell/view location of the test (see “Design Tab—Schematic
Designs” on page 114)
Includes Specify include and model files (see “Includes Tab—Spectre Tests” on
page 191)
Globals Specify global design variables (see “Globals Tab—Schematic Designs”
on page 115)
Components Specify additional component instances for loads and stimuli—for
example, resistors, capacitors, voltage and current sources (see
“Components Tab” on page 130)
Analyses Specify analyses (see “Analyses Tab—Spectre Tests” on page 196)
Sim Options Specify simulation control options (see “Sim Options Tab—Spectre Tests”
on page 198)
Measures Specify measurements for the test (see “Measures Tab” on page 141)
Run Specify run options, run the test, view run status, display measured
results, and plot simulation data waveforms (see “Run Tab” on page 150)
The buttons displayed along the bottom of the Test Setup window are described in the
following table.
Button Purpose
Run Starts a simulation for the completed test setup and displays the Status tab
(on the Run tab) of the Test Setup window (see “Run Tab—Status Tab” on
page 153)
Note: If any changes have been made in the Test Setup window since the last
Apply, the user is prompted with the following choices prior to running the test:
Test values have changed. Do you want to save the changes before
run?
❑ Yes — Apply the changes and run the test.
❑ No — Don’t apply the changes. Run the test using the last-saved Test
Setup settings.
❑ Cancel — Don’t apply the changes; don’t run the test.
OK Saves the entered paths, directories, and current data in the tabbed forms
and closes the Test Setup window
Cancel Exits the current window without saving the data
Apply Saves the current test setup data
Help Opens an Online Help topic for the current window
Note: The button appears dynamically in numeric fields where the Expression Builder
can be applied (it appears after the mouse is clicked somewhere in the field). The Expression
Builder provides an environment for building complex expressions wherever numeric
expressions apply on forms. See "Expression Builder" in the Virtuoso
Specification-driven Environment Reference.
Button Description
Open Design Opens the indicated schematic in the Virtuoso Schematic Editor
Browse Opens the Select Design form for navigating to and selecting a
design (see “Specifying a Design Location” on page 185)
View List Opens the View List form for specifying ordered lists of switch and
stop cell views that contain information that can be simulated
Note: When the netlist is created, these lists are searched in order
until a cell view is found. For more information, see the Cadence
Analog Design Environment User Guide.
Tasks See
■ Get and view variables from a schematic design Design Vars Tab on
page 116
■ Save design variables back to the schematic
■ Specify how schematic design variables are evaluated and
updated in the design
For VSdE Native Spectre tests only (see “Create Test Form— Design Alters Tab on
Spectre Tests” on page 181): page 124
■ Alter instance parameters without requiring design parameters
■ Use PDKs with callbacks in sweeps, corners, etc.
The Value Used is typically a project (or workspace) parameter that is used for the test. The
Current Value is the evaluated value of the design variable.
The Value Used can contain an expression, and that expression can be based on measured
results from another test. For example, the following expression uses the results of another
test’s measure multiplied by two:
“test_name”.measure_name*2
where test_name is the lib/cell/view path to and name of a test, and measure_name is
the name of a measure in that test. For example:
“myLib/myCell/myView/test1”.meas1*2
Important
By default, parameters found in the Value Used column are interpreted as Perl
variables and evaluated prior to netlisting. Single quotation marks can be used
around the entire string to force the Value Used to be passed directly to the netlist
as it is, without any evaluation.
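As an illustration of the quoting rule above (the expressions shown are hypothetical):

```
Value Used: $vdd_value/2     Evaluated as a Perl expression before netlisting
Value Used: '$vdd_value/2'   Single-quoted: passed to the netlist as is, unevaluated
```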
To update the value displayed in the Current Value column after changing the Value Used,
see “Updating the Current Value” on page 118.
The Update button above the Current Value column is used to force a recalculation of the
Current Value displayed, which is useful after changing the Value Used or changing a
project or workspace parameter value on the Parameters tab (see “Project Parameters” on
page 91 or “Workspace Parameters” on page 95).
To update the value displayed in the Current Value column after changing the Value Used,
do one of the following:
➤ Click Apply, or
➤ Click Update (above the Current Value column).
To update the value displayed in the Current Value column after changing a project
parameter value, do the following:
➤ Click Update (above the Current Value column).
To remove a design variable from the Design variables table, do the following:
1. On the Design Vars tab, select a row in the Design variables table.
2. Click Remove.
To read design variables from the schematic (honoring the switch view list) and display them
in the Design variables table, do the following:
1. On the Design Vars tab, click Get Design Vars.
2. Click Apply to open the Add Parameter form (see “Adding Design Variables” on
page 117).
The columns of the Design variables table contain the following information:
Column Contents
Name Design variable name
Value Used Project/workspace parameter used to represent the design variable for
the test (as $Name)
Note: The Value Used can be a constant, a parameter, or a parameter
expression.
Current Value Evaluated value based on the parameter or expression in the Value
Used column
For subsequent uses of Get Design Vars, design variables are read from the schematic and
the following actions are performed:
■ If a design variable is not already listed in the table, then it is added, as above.
■ If a design variable already exists in the table, and the value read from the schematic is
different from the Current Value for that variable, then:
❑ If Value Used is a simple number, then the new value replaces the Current Value.
❑ If Value Used is a parameter or expression, then the new value does not replace
the Current Value. Instead, the New Value and Current Value are displayed on
the Global Parameter Differences form:
To update the Current Value to match the value read from the schematic (the
New Value), the project/workspace parameters must be changed on the
Parameters tab of the VSdE main window (see Chapter 4, “Understanding
Parameters”).
To write the Current Value of the design variables to the schematic, do the following:
➤ On the Design Vars tab, click Save Design Vars.
Note: For ADE designs, the design variables are written to an ADE-specific property,
overwriting anything previously saved there.
The Update design variables with fixed values group box is used to specify how
schematic design variables, and expressions in which they are used, are evaluated and
updated in the design. Replacing design variables and expressions with fixed values is useful
after completing a design, when the design variables and expressions need to be replaced
by fixed values before going to extraction/layout.
Important
Changes are not reflected in the Design variables table on the Design Vars tab.
The Design variables table continues to display the design variables (even though
they are no longer present on the schematic design). Also, the design variables are
not removed from the Test Setup, Sweep, or Corners windows because a test or
sweep can be set up for use by more than one design. However, sweeping a design
variable that is no longer present on the schematic design can cause confusion: the
sweep runs, but the results do not change (because the design now has fixed
values).
Item Description
Evaluate operators Check box to enable full evaluation of expressions in schematic
design variables such that expressions are replaced by fixed
values when Update is clicked
Evaluate suffixes Check box to enable evaluation of scientific suffixes in schematic
design variables (for example, u for 1e-6, etc.) and to display numbers
using engineering notation instead, such that (for example) when
Update is clicked, 50u becomes 5e-5
Item Description
Update Click to perform the indicated (marked) evaluations on schematic
design variables, and to replace those variables with fixed values on
the schematic
Note: The Update Design Variables form appears, with a table of the
design variables and the fixed values evaluated for them, to confirm the
updates. The check boxes in the Update column are used to
enable/disable individual design variable updates. Those variables that
are updated to a fixed value can no longer be swept. The Restore
operation can be used to restore the original design variables or
expressions, which are saved in a property in the schematic editor
(_acvOrigExpr).
View Log Click to view a log file of the schematic design variable changes
performed
Restore Click to restore the original (saved) value expressions
Note: This operation replaces the fixed value with the original design
variable or expression that was saved in the _acvOrigExpr property.
Clean Click to remove the schematic properties that contain the original
saved value expressions
Important
This action cannot be undone, and the Restore operation will
no longer work once the Clean action is performed.
For example, consider a design in which the global design variable length has the value 2.5:
Selection Description
Evaluate operators length*2u becomes 5u
Evaluate suffixes length*2u becomes 2.5*2e-6
Both length*2u becomes 5e-6
Important
The Design Alters tab is available for VSdE Native Spectre tests only (see “Create
Test Form—Spectre Tests” on page 181).
Design alters are used to alter instance parameters without requiring design parameters to
be defined (on the schematic), and to use PDKs with callbacks in sweeps and corners
analyses.
Design alters (previously saved as a property on each cell) are obtained from the schematic
and displayed in the Design alters table as follows:
Column Information
LCV Instance lib/cell/view
Inst Instance name, displayed as /instance
Name Instance parameter to be altered
Value Used Project (or workspace) parameter that is used in the test; by default,
$Param
Note: The Value Used column can contain an expression.
Current Value Evaluated value based on the parameter or expression in the Value
Used column
2. On the Select Instances to Parameterize form, click Select to bring the schematic into
the foreground in selection mode.
3. On the schematic, select one or more instances to be altered. Each instance name
appears in the Instances area on the Select Instances to Parameterize form (for
example, "/M1" "/M3").
Important
All selected instances must have the same Component Description Format (CDF)
instance parameter list, and these parameters must be altered in the same way.
To remove one or more design alters from the Design alters table, do the following:
1. On the Design Alters tab, select one or more rows in the Design alters table.
2. Click Remove.
Tip
Use click-drag to select multiple contiguous rows. Use Ctrl-click to add
individual rows to, or remove them from, the selection set.
Evaluating Callbacks
The Evaluate callbacks check box is used as follows to determine whether or not to
evaluate callbacks during simulation:
■ When the Evaluate callbacks check box is marked, callbacks are evaluated during
simulation for each different parameter value. Alters are used to define parameter values
for simulation.
■ When the Evaluate callbacks check box is not marked, callbacks are not evaluated
during simulation.
Note: Callbacks are always evaluated when updating the schematic (see “Updating the
Schematic” on page 128). It is OK to mark the Evaluate callbacks check box even if there
are no callbacks to evaluate.
Callback procedures are used in PDKs to recalculate parameters that depend on other
parameters. For example, MOS parameters as, ad, ls, ld, and area depend on the value
of the width parameter, w. So, when w gets changed (for example, by editing the instance
parameter on the schematic), these dependent parameters must be recalculated.
When the Design Alters tab is used, the instance parameters are changed on a "working
copy" of the schematic cellview (created in acvScratchLib), and when Evaluate
callbacks is enabled, library callbacks are executed during simulation to recalculate the
values of the dependent parameters. Values read from the working-copy schematic are used
to create the alters.
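The kind of dependency that a callback resolves can be illustrated with a simple sketch. The function name, the hdif value, and the area formula below are generic illustrations only, not any particular PDK's callback:

```python
def mos_callback(inst, hdif=0.5e-6):
    """Hypothetical callback: recalculate drain/source areas from width w.

    Mimics the pattern described above: when w changes, dependent
    parameters (here ad and as) are recomputed from it.
    """
    w = inst["w"]
    inst["ad"] = inst["as"] = 2 * hdif * w  # illustrative area formula
    return inst

inst = {"w": 1e-6}
mos_callback(inst)
print(inst["ad"])
```

Doubling w in this sketch doubles ad and as, which is exactly the recalculation that must happen for each parameter value when alters are swept with Evaluate callbacks enabled.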
Note: Library developers should also see “Customizing the List of Instance Parameters for
Design Alters” on page 128.
To update the cell instances on the schematic with the current values of the project
parameters, do the following:
➤ On the Design Alters tab, click Update Schematic.
The list of parameters available on the Instance drop-down on the Select Instances to
Parameterize form (see “Adding Design Alters” on page 126), as well as their default values,
comes from the CDF properties. This list can be customized such that only a particular subset
of parameters is available for selection (for example, w and l of a MOS device, but not ad,
as, ls, or ld, because these are calculated from w and l). In addition, a minimum and
maximum value limit can be defined for each parameter in the selection list.
2. Include a SKILL disembodied property list (DPL) using the following structure (see "Data
Structures: Disembodied Property Lists" in the SKILL Language User Guide):
libInfo=`(
libName (nil
cellName (nil
params (
paramName (nil)
…
)
[procs ([procName …])]
)
…
)
…
)
3. Finally, include a call to acvDefineLibParamInfo using the name of the DPL as the
argument as follows:
acvDefineLibParamInfo(libInfo)
Important
If the SdE context (acv.cxt) has not been loaded, or if the loaded context is not
current for this release, then the acvDefineLibParamInfo call can be wrapped
as follows to prevent a stack trace when loading libInit.il:
if(isCallable('acvDefineLibParamInfo)
then
acvDefineLibParamInfo(libInfo)
)
For library portability, each libInit.il file typically contains a single libInfo structure
for a single libName. The libInfo structure can contain more than one cellName, and
more than one paramName for each cellName. The procs line is optional and can be
specified as procs nil or left out altogether. When the Evaluate callbacks check box is
marked, callbacks associated with each CDF parameter are executed. Any procedures called
using procs are called in addition to the parameter callbacks, and without arguments.
For example, a libInit.il file for a hypothetical library myPrimLib (the library, cell,
and parameter names here are illustrative) might contain the following:
libInfo=`(
myPrimLib (nil
res (nil
params (
w (nil)
l (nil)
)
procs (myPrimLibCalcDependentRes)
)
)
)
acvDefineLibParamInfo(libInfo)
Components Tab
The Components tab is used to select and configure components for a test. The set of
components available depends on the set of libraries configured.
The Add button is used to add components from the configured libraries to the test. Available
libraries (for example, spectre_prim, worklib, builtins) are displayed in the drop-down list in
the Library field on the Add Design Component form, which is presented when Add is
clicked.
Instance names for components added to a test appear in the Components list area.
Connections between component pins and design nets are made in the Pin/Net table (by
selecting a net from the drop-down list in the Net column for each component pin in the Pin
column). The fields and tables that appear on the tab depend on the component selected in
the Components list.
Standard buttons on the Components tab and their functions are outlined in the following
table:
Button Description
Add Opens the Add Design Component form for adding a component to the test
(see “Add Design Component Form” on page 132)
Delete Removes the selected component from the test
Copy Copies the selected component, presenting the Set Instance Name form for
naming the new instance added to the test
Rename Displays the component instance name in an edit field so that the name can
be modified
Note: A shortcut for clicking Rename is to double-click the component
instance name in the Components list area.
The following fields are available on the Add Design Component form:
■ Library: selectable from a pull-down menu of built-in and customized user-supplied
libraries
■ Cell, View, and File, which are filled in when a design component is selected in
the list field below the File field.
The buttons on the Add Design Component form are described in the following table:
Button Description
Add Opens the Set Instance Name form:
Item Description
Pin/Net table Lists pins of the selected component (in the Pin column) and
allows connection of the design nets to these pins by selecting a
design net from the drop-down list in the Net column for each pin in
the Pin column
Note: You can use the Select button in the Schematic column to
select a net on the schematic. For non-schematic nets, the net name
must not have the forward-slash escape character (/) at the front.
The slash indicates a schematic net name that must be mapped to
a netlist name. If the escaped name is not found (in the amap
directory), it is mapped to the empty string.
Subckt field Contains the name of the SPICE subcircuit, from the File indicated
in the Component location fields, whose parameters and pins are
displayed on the Components tab
Parameters table Lists the parameters of the selected component, including
parameter descriptions, their current and default values, and the
units of the parameters. The fields in the Value column can be
modified
Component Provides fields for displaying the Library, Cell, View, and File
location group box name information for the selected component instance
Edit button Used to open the File, indicated in the Component location area,
in a text editor window
The fields, tables, and buttons that appear on the Components tab depend on which
component is selected.
The analog piecewise linear components in the builtins library require monotonic time-value
pairs (time-voltage for vpwl_analog and time-current for ipwl_analog), which are specified
by adding rows to the Time/Voltage (or Time/Current) table and typing values in the
appropriate columns. The buttons operate as described in the following table.
Item Description
Add Row Adds an empty row after the last row
Insert Row Inserts an empty row beneath the selected row
Delete Row Removes the selected row and its contents
The analog piecewise linear from-a-file components in the builtins library require monotonic
time-voltage (or time-current, for ipwl_file) pairs in a file, expressed in columns of data which
can be separated by spaces or tabs. Any blank lines in the file are ignored. Lines containing
either a * or a # in the first column are assumed to be comments and are also ignored.
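A reader for this file format — columns separated by spaces or tabs, blank lines ignored, lines whose first column is * or # treated as comments — might be sketched as follows. This is not the tool's parser; the monotonic-time check simply mirrors the requirement stated above:

```python
def read_pwl(lines):
    """Parse PWL time-value pairs from text lines; enforce monotonic time."""
    pairs = []
    for line in lines:
        if not line.strip():
            continue                      # blank lines are ignored
        if line[0] in "*#":
            continue                      # comment lines are ignored
        t, v = line.split()[:2]
        t, v = float(t), float(v)
        if pairs and t <= pairs[-1][0]:
            raise ValueError("time values must be monotonically increasing")
        pairs.append((t, v))
    return pairs

text = """* hypothetical vpwl_file data
0      0.0
1e-9   1.2

2e-9   0.0
"""
print(read_pwl(text.splitlines()))
```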
The fields in the Parameters group box are used as described in the following table.
Item Description
Filename Name of file containing the PWL time-voltage (or time-current for
ipwl_file) pairs
Tip
Click to open the Select a vpwl or ipwl file form for
navigating to and selecting a file.
Note: If Filename is a simple file name without a path, then the current
directory is searched first, followed by any directories specified in the
-I: Include directories list area on the Netlist Files tab on the
Includes tab of the Test Setup window (see “Netlist Files—Spectre
Tests” on page 192).
Initial delay Time to delay the start of the PWL
Multiplier Multiplier
The digital piecewise linear components in the builtins library are used to build analog
representations of digital PWL stimuli and require the following information.
In the Stimulus table, values (1’s and 0’s) can be either typed or defined using the following
buttons:
Button Description
Invert Inverts the entire stimulus; all 1’s become 0’s and all 0’s become 1’s
Repeat Repeats the last value
Toggle Inserts the inverse of the last value; inserts a 0 when 1 is the last value, or 1
when 0 is the last value
Note: If the Stimulus field is empty, clicking Invert inserts a 1 as the first value in the field.
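The effect of these buttons on the stimulus string can be sketched as follows. This is a hypothetical model of the behavior described above (including the empty-field note for Invert), not the tool's implementation:

```python
def invert(stim):
    """All 1's become 0's and vice versa; an empty field gets a 1."""
    if not stim:
        return "1"
    return "".join("1" if c == "0" else "0" for c in stim)

def repeat(stim):
    """Repeat the last value."""
    return stim + stim[-1] if stim else stim

def toggle(stim):
    """Insert the inverse of the last value."""
    return stim + ("0" if stim[-1] == "1" else "1") if stim else stim

print(invert("1010"))   # 0101
print(repeat("10"))     # 100
print(toggle("10"))     # 101
```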
The fields in the Parameters group box are used to define the PWL stimulus as described in
the following table.
Field Description
High value High analog value for the PWL waveform
Low value Low analog value for the PWL waveform
Interval Duration of each of the PWL high-value or low-value segments
Rise time Rise time for changing from Low value to High value
Fall time Fall time for changing from High value to Low value
The Repeat stimulus check box is used to enable/disable PWL repeat mode, which repeats
the specified PWL stimulus indefinitely.
The fields in the Parameters group box are used to define staircase components as
described in the following table.
Field Description
Initial delay Time (from time-zero) to wait before taking the first step
Start value Voltage or current value at which to start the staircase
Note: This value is held from time-zero through Initial delay.
End value Voltage or current value at the top of the staircase
Step count Number of steps to go from Start value to End value, after Initial delay
Step time Time from the start to the completion of each step (including Step slope
time)
Step slope Time it takes to get from the top value of the last step to the top value of the
next step
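The breakpoints a staircase source steps through can be sketched from these fields. This simplified model ignores Step slope and is an illustration only; the names follow the table above:

```python
def staircase_points(initial_delay, start, end, step_count, step_time):
    """Return (time, value) breakpoints for an idealized staircase."""
    points = [(0.0, start), (initial_delay, start)]  # hold Start value
    step = (end - start) / step_count                # size of each step
    for k in range(1, step_count + 1):
        t = initial_delay + k * step_time
        points.append((t, start + k * step))
    return points

# Four 0.25 V steps from 0 V to 1 V, starting after a 1 us delay
print(staircase_points(1e-6, 0.0, 1.0, 4, 2e-6))
```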
Measures Tab
Measurement functions, which operate on simulation results, are specified and set up on the
Measures tab of the Test Setup window. The appearance of the tab depends on the type of
measure being added. Also, when a measure name is highlighted in the Measures list, the
remaining items on the tab display information about that measure.
The set of measurement functions that are available to the test are listed in the Measures list
box. Those marked with check marks are performed when the test is run. Measures in a test
must be ordered such that any measure whose result is used in another measure comes first
in the Measures list.
Tip
Use the right-click pop-up menu in the Measures list area to Check All or Uncheck
All measure check boxes. This right-click menu offers choices for the following
actions: {Add, Rename, Delete, Move Up, Move Down, Check All, Uncheck
All}.
The following set of buttons applies to all measure types, unless otherwise indicated:
Buttons Description
Add Presents a submenu of measure types (see “Adding a Measure” on
page 143)
Delete Removes the selected measure from the test (see “Deleting a Measure” on
page 148)
Copy Copies the selected measure (see “Copying a Measure” on page 148)
Rename Renames the selected measure (see “Renaming a Measure” on page 149)
(Up/down arrows) Moves the selected measure up or down in the list, as indicated by the arrow
Help For OCEAN and MATLAB measures only, a Help button appears to the right
of the Measures list and is used to display help for the selected measure
See also “Creating Results from Other Measured Results” on page 149.
The following check boxes are available for all measure types, unless otherwise indicated:
Adding a Measure
The following measure types are supported, as presented on the Add submenu:
Note: If a measure requires a particular analysis (or analyses), and the required analysis has
not been specified, then the environment displays a form to disable the measure. For
example:
When Add – ADE Calculator is selected, the Measures tab is configured for specification
of a measurement expression using the Calculator. Which Calculator window appears
depends on the Default for Calculator and Direct Plot selection in the Defaults group box
on the Other tab of the Options window (see “Defaults Group Box” on page 242). The
following buttons are displayed beneath the text area in the ADE Calculator Expression
group box:
Button Description
Calculator Opens the Calculator window for developing measures
Get Calc Expr Imports the expression from the Calculator into the text area in the
ADE Calculator Expression group box
Direct Plot Opens the Direct Plot Form, the design schematic window, and either
a Waveform Window or a Graph Window for plotting simulation results,
depending on the Default for Calculator and Direct Plot selection in
the Defaults group box on the Other tab of the Options window (see
“Defaults Group Box” on page 242)
Note: For information about the Direct Plot Form, see Chapter 10,
"Plotting and Printing,” in the Cadence Analog Design
Environment User Guide.
DP -> Calc Imports the plot expression from Direct Plot into the Calculator
Get DP Expr Imports the plot expression from Direct Plot into the text area in the
ADE Calculator Expression group box on the Measures tab
Alternatively, any valid Calculator expression can be typed directly into the text area in the
ADE Calculator Expression group box.
When Add – OCEAN Script is selected, the Measures tab is configured for specification of
a measurement function built using the OCEAN scripting language. OCEAN statements can
be typed in the text area in the OCEAN Script group box on the Measures tab. For example:
crosstime=cross(VT("/clk"),0.5,0,"either")
unless(crosstime==nil acvMeasResult=crosstime)
When typing an OCEAN script measure in the text area provided, observe the following
guidelines:
■ Getting Waveform Data on page 146
■ Writing Error Messages on page 146
■ Saving Measured Results on page 147
Calculator functions are used for getting data out of the simulation results file. For example:
crosstime=cross(VT("/clk"),0.5,0,"either")
Important
For components that are specified on the Components tab rather than on the
schematic (see “Components Tab” on page 130), the name specified as the
argument to the Calculator function must be the simulator name instead of the
schematic name. The schematic name begins with a slash (/) and has a slash
between the component name and any port name. The simulator name depends
on the target simulator. For the Spectre circuit simulator, the simulator name does
not begin with a slash, and has a colon (:) between the component name and any
port name. For example, the current through a DC source defined on the
Components tab is specified as IT("vdc_port:p") when Spectre is the target
simulator, whereas the same current when the DC source is on the schematic is
specified as IT("/vdc_port/PLUS").
In order for error messages from measures to be recognized when the OCEAN script is run
from the environment, they should be prefixed by Error:. For example:
printf("Error: abcd %L must be >0\n",abc)
The following methods are supported for saving measure data to the result database:
■ acvMeasResult variable: For custom measures that return one measurement result,
the value (or list of values) can be assigned to the acvMeasResult variable. The value
can be a waveform. The OCEAN code wrapped around the measure recognizes this
variable and saves the value or values to the result database using the measure’s
instance name as a key.
■ acvMdumpRes function: For custom measures that generate more than one result, each
result value can be saved to the result database by calling the acvMdumpRes function
using one of the following forms:
Using the result name as an identifier:
acvMdumpRes("resultName" resultValue);
Using the measure instance name and result name as an identifier:
acvMdumpRes("instanceName:resultName" resultValue);
Applying an optional suffix to the result name as an identifier:
acvMdumpRes("resultName" resultValue optionalSuffix);
Note: In the case of a nil result, ***** is written to the results file.
Here are some examples of the acvMdumpRes function, showing the format of the results as
they are written to the results file:
Copying a Measure
To copy a measure, do the following:
1. On the Measures tab, select a measure from the Measures list.
2. Click Copy.
The Set Measure Name form appears:
3. In the Measure name field, type a name for the measure as it is to appear in the test.
4. Click OK.
The measure name appears in the Measures list area on the Measures tab. The check box
for the added measure is marked by default.
Deleting a Measure
To delete a measure from the test, do the following:
1. On the Measures tab, select a measure from the Measures list.
2. Click Delete.
To delete a measure from the test using the right-click pop-up menu, do the following:
1. In the Measures list, right-click a measure.
The pop-up menu appears.
2. Select Delete.
Renaming a Measure
To rename a measure in the test, do the following:
1. On the Measures tab, select a measure from the Measures list.
2. Click Rename.
The measure name is selected for editing.
3. Type a new name or edit the existing one.
4. Press Return to accept the changes or Esc to cancel the edit.
Creating Results from Other Measured Results
For example:
my_new_meas = my_ocn_meas + 4.4
acvMdumpRes("my_new_meas_result" my_new_meas)
Important
In order for one measured result to be used in another measure’s expression, the
referenced result must be defined before the new expression. The order of the
measures can be changed by selecting a name in the Measures list and using the
up and down arrow buttons (see “Measures Tab” on page 141). Also, the Ending
OCEAN commands are executed last.
Run Tab
The Run tab of the Test Setup window features the following sub-tabs:
Tip
The right-click pop-up menu for the test offers the following additional selections for
schematic editor users: Open Design (for opening the design in the schematic
window) and Create/View Netlist (for displaying the design netlist in a read-only
text editor window).
Running a test causes one or more simulation files to be generated, and the output results
data is captured. A single simulation is run, for each analysis specified, using the test values
and any project or workspace parameters referenced in the test. Pin connections are checked
when the test is run. Unconnected pins can be resolved on the Net Mapping form (see “Using
the Net Mapping Form” on page 189).
Test results are written to the following directory:
results/lib/cell/view/testName/runx
where x represents the run number (for example, run0). For example, results for lib/cell/view
PllLIB/vcoV2F/schematic, test name vcoV2F, run 0, are written to the following
directory:
results/PllLIB/vcoV2F/schematic/vcoV2F/run0
The final netlist is also named according to the run number as follows:
testName_x.va
where x represents the run number. For example, the netlist file for the vcoV2F test, run 0,
is named as follows:
vcoV2F_0.va
The Status tab on the Run tab of the Test Setup window is displayed when any of the
following actions is performed:
■ In the Test Setup window, click Run to run the test.
■ Select a test in the Tests folder on the Files tab of the VSdE main window, and choose
Run – Run Test.
■ Right-click a test in the Tests folder and select Run from the pop-up menu.
Field Description
Status Displays the percent done during the run, and the final status at the
end of the run (passed or failed)
Simulator Displays the simulator used for the run (for example, Spectre)
Host Displays the host name of the computer on which the test is run
Start Displays the start date for the run
Start time Displays the starting time for the run
Elapsed time Displays the elapsed (clock) time for the run
CPU time Displays the CPU time used for the run
Memory used Displays the memory used for the run
Output files size Displays the disk space allocated for the output files
Run dir Displays the run directory for the test; for example:
results/lib/cell/view/testName/run0
Button Description
View Netlist Opens the netlist for the test in a read-only text editor window
View Script Opens the simulation control command file (script) for the test in a
read-only text editor window
View Output Opens the simulator output messages file in a read-only text editor
window
View Errors Opens the simulation error messages file in a read-only text editor
window
View Other Opens the View Run File form for selecting a run file to display in a
read-only text editor window
Open Xterm Opens an xterm window in the simulation run directory
Note: The xterm command is specified using the ACV_XTERM
environment variable whose default value is:
ACV_XTERM = xterm -sb -sl 500 -geometry 100x30
Button Description
Calculator Opens the Calculator window for developing measures
Note: Which Calculator window appears depends on the Default for
Calculator and Direct Plot selection in the Defaults group box on the
Other tab of the Options window (see “Defaults Group Box” on page 242).
Plot Plots the selected measure according to the selection from the drop-down
menu:
❑ Clear, then plot—Clears any traces currently plotted in the
current waveform window, then plots the selected item or items.
This is the default setting.
❑ Plot in new window—Opens a new waveform window and plots
the selected item or items. This window becomes the current
window.
❑ Overlay in current plot—Plots the selected item or items as an
overlay in the current plot, leaving whatever traces might already
be there intact.
Note: If AWD is not already running, then selecting an item from the Plot
drop-down menu opens a waveform window and plots the selected item.
Format Displays the Format tab of the Cell Formatting window (see “Formatting
Table Cells” on page 260) for formatting the selected row or rows of data
Tip
Use click-drag to select multiple cells.
DP -> Calc Imports the plot expression from Direct Plot into the Calculator
Direct Plot Opens the Direct Plot Form, the design schematic window, and either a
Waveform Window or a Graph Window for plotting simulation results,
depending on the Default for Calculator and Direct Plot selection in the
Defaults group box on the Other tab of the Options window (see “Defaults
Group Box” on page 242)
For information about the Direct Plot Form, see Chapter 10, “Plotting and
Printing,” in the Cadence Analog Design Environment User Guide
Op Point Opens the Operating Point window if there are any operating point data
files for the test (see “Analyzing Operating Point Information” on page 263)
View Presents a drop-down menu of the following selections (see the Cadence
Analog Design Environment User Guide):
■ Mismatch Summary
■ Noise Summary
■ Noise Parameters
■ Stability Summary
■ Pole-zero Summary
■ Sensitivity
■ Advanced Analysis
Note: The Hide domain data check box—which only appears for results files that contain
domain data (for example, times for transient analysis, or frequencies for AC analysis)—can
be marked to cause domain data not to be displayed on the Results tab.
Additionally, the following buttons are available on the upper right corner of the tab for
exporting and printing measured results:
Button Description
Exports the measured results data to an HTML file in the docs directory for
the project, named as follows:
Measures_libName-cellName-viewName-testName.html
where libName, cellName, viewName, and testName uniquely identify
the results.
Opens the Print Setup form (see also "Printer Setup" in the Virtuoso
Specification-driven Environment Reference)
Opens the Save As form for selecting a name and location for the file of
comma-separated values to which to save the data
The Auto-plot check box (upper right corner of the form) is used to enable/disable automatic
plotting of those signals for which the Plot check box is marked, and all measure results, upon
successful completion of the simulation.
Note: Waveforms from different analysis types are plotted in separate windows. All measure
results are plotted in a single window.
The information for this form is discussed in the following sections of “Viewing Waveforms” on
page 278:
When multiple analyses have been run, the Plot analysis drop-down menu is used to specify
which analysis results are to be plotted. When the Plot analysis type is AC, XF, or SP, the
Modifiers selection list appears in the lower left corner of the tab. The list of modifiers
depends on which Plot viewer is selected. For example, for AWD, the Modifiers are as
follows:
Selection Plots
DB20 & Phase db20 and phase of the selected net or terminal
DB10 & Phase db10 and phase of the selected net or terminal
Mag & Phase Magnitude and phase of the selected net or terminal
Real & Imag Real and imaginary parts of the selected net or terminal
DB20 db20 of the selected net or terminal
DB10 db10 of the selected net or terminal
Mag Magnitude of the selected net or terminal
Phase Phase of the selected net or terminal
Real Real part of the selected net or terminal
Imag Imaginary part of the selected net or terminal
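The AC-type modifiers correspond to standard transformations of one complex result value; the following sketch shows the quantities behind each selection (an illustration, not the tool's implementation):

```python
import cmath
import math

def modifiers(z):
    """Return the quantities behind the AWD modifier selections for one point."""
    mag = abs(z)
    return {
        "DB20": 20 * math.log10(mag),           # voltage/current in dB
        "DB10": 10 * math.log10(mag),           # power-like quantity in dB
        "Mag": mag,
        "Phase": math.degrees(cmath.phase(z)),  # phase in degrees
        "Real": z.real,
        "Imag": z.imag,
    }

m = modifiers(1 + 1j)
print(round(m["DB20"], 3))   # 3.01
```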
When the Plot analysis type is XF, the Signals to plot area becomes the Instances to plot
area.
When the Plot analysis type is SP, the Signals to plot area becomes the S-Parameters
to plot area, and the Plot types selection list appears next to the Modifiers list:
Selection Description
Auto Plots the data using an appropriate plot type
Rectangular Plots the data in a rectangular coordinate system
Z-Smith Plots impedance data in a Smith chart
Y-Smith Plots admittance data in a Smith chart
Polar Plots the data in a polar chart
6
Creating Tests for an ADE Simulator
Important
Some ADE netlisting functionality will not work if the netlisting mode is not set to
analog prior to opening the environment. For icms, the netlisting mode is
automatically set to analog; for other applications, the CDS_Netlisting_Mode
environment variable must be set to Analog:
setenv CDS_Netlisting_Mode Analog
The Test Setup window appears (see “Test Setup Window—ADE Tests” on page 166).
2. Click the appropriate buttons on the ADE tab to display the necessary ADE forms—for
example, clicking Analyses causes the ADE Choosing Analyses form specific to the
selected simulator (for example, spectre) to be displayed; similarly, clicking Simulation
Options causes the ADE Simulator Options form for the targeted simulator to be
displayed.
Note: See “BUTTON” on page 172 for information about how to customize the set of
buttons and the ADE forms they display.
3. Click Apply on the ADE forms to save changes, or OK to save changes and close the
forms.
Each time the test is opened, all forms are initialized with the saved test and state file
information. Whenever any of the ADE setup information changes, the OCEAN script is
displayed when the test is saved.
When the test is run, the generated script performs the following tasks:
■ General setup
■ Runs the OCEAN script (to run the target simulator)
■ Evaluates measures
Note: Measures created using the Calculator are added to the OCEAN script.
For example, to specify the section name from the .scs library as a parameter which can be
varied, do the following:
1. On the ADE tab, click Model Libraries to display the ADE Model Library Setup form.
2. Type or browse for the model library file, and in the Section (opt.) field, type
#section#.
3. Click Add and OK to save the changes and close the form.
4. In the Test Setup window, click OK to save changes and close the window.
To set up a corners analysis over the section choices from the .scs library, do the following:
1. On the Corners tab, click Add Param to add a column for the $section parameter.
2. In the Corners parameters table, create corners for each of the section names.
For example:
When the environment runs the analysis, the section parameter (which appears as
#section# on the ADE form and $section in the Project Parameters table in the
environment) is varied through the specified values.
Note: The default.config file contains customization information that is used when
a simulator_name.config file cannot be found.
2. Copy the customization file to $ACV_ROOT/admin.
3. Open the environment.
Section Description
FILE_VERSION Contains customization file version, create date, and simulator
name information (see “FILE_VERSION” on page 171)
BUTTON Contains a list of button names that appear on the ADE tab,
callback function names for the forms that get displayed, and
tooltip text for each button (see “BUTTON” on page 172)
ANALYSIS_TYPE Contains a list of PSF file names associated with each analysis
type, and the result type stored in each file (see
“ANALYSIS_TYPE” on page 173)
OPPTNV_SETUP Contains the name of the operating point node voltage
annotation setup file (see “OPPTNV_SETUP” on page 174)
VIEW_NETLIST_RUN Contains a list of names and/or extensions of the files in the run
directory to display when the View Netlist action is performed
from the Status/Job Status tab (see “VIEW_NETLIST_RUN”
on page 175)
VIEW_NETLIST_TEST Contains a list of names and/or extensions of the files in the
project/tests directory to display when the View Netlist
action is performed from the right-click pop-up menu for a test or
from the Run menu in the VSdE main window (see
“VIEW_NETLIST_TEST” on page 175)
EXPAND_DOT Contains a list of state file names and the fields in that file which
can contain a relative path that the environment needs to
expand—for example, ./BiCMOS (see “EXPAND_DOT” on
page 176)
STATUS Contains the location and name of the file containing simulation
status information, so that this information can be displayed on
the Test Console tab (see “STATUS” on page 177)
Important
The simulator_name.config customization file must be copied to
$ACV_ROOT/admin prior to opening the SdE.
FILE_VERSION
The following keywords appear in the FILE_VERSION section of the customization file:
BUTTON
A standard set of ADE forms (for specifying simulation information—for example, analysis
setup information) can be displayed from the ADE tab of the Test Setup window using the
buttons provided on that tab. The set of buttons and the ADE forms they display can be
customized by specifying the button text, tooltip text, and schematic editor command for
displaying each ADE form in the BUTTON section of the simulator_name.config
customization file as follows:
buttonName, ADEFormCallbackFunction, tooltipText
where
buttonName is the text that appears on the ADE tab button in the Test Setup window
ADEFormCallbackFunction
is a valid ADE form callback function name, which is the same as what is
used in the simulator_name.menus file for customizing ADE
tooltipText is the tooltip text that appears when the mouse cursor hovers over the
button
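For illustration, a BUTTON section might look like the following (the button names and callback function names here are hypothetical examples, not actual ADE callback names):

```
BUTTON {
    Analyses, exampleAnalysesFormCB, Set up analyses for simulation
    Sim Options, exampleOptionsFormCB, Set simulator options
}
```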
ANALYSIS_TYPE
The ANALYSIS_TYPE section of the customization file contains analysis and results file
mapping information in the following format:
SimulatorAnalysis, OCEANanalysis, DisplayName, psfFileName, resultType
where SimulatorAnalysis is the analysis name as recognized by the simulator,
OCEANanalysis is the corresponding OCEAN (script) analysis command, DisplayName is
the analysis name displayed in the environment, psfFileName is the name of the PSF file
in which the results are stored, and resultType is the type of result stored in that file.
For example, the ANALYSIS_TYPE section of a customization file might be written as follows:
ANALYSIS_TYPE {
# Simulator OCEAN(script) Display psf file Result
# analysis analysis name name Type
#
tran, analysis('tran, Transient, timeSweep, wave
ac, analysis('ac, AC, frequencySweep, wave
dc, analysis('dc, DC, srcSweep, wave
noise, analysis('noise, Noise, noise, wave
op, ?saveOppoint t, DC OpPt, opBegin, oppt
tranOp, analysis('tran, Tran Final OpPt, opEnd, oppt
}
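As an illustration of how lines in this format could be consumed, the following is a small hypothetical parser (not part of the product) that builds a mapping keyed by the simulator analysis name:

```python
# Illustrative sketch: parse ANALYSIS_TYPE lines of the form
# "SimulatorAnalysis, OCEANanalysis, DisplayName, psfFileName, resultType"
# into a dictionary keyed by the simulator analysis name.
def parse_analysis_type(lines):
    mapping = {}
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comment lines
        fields = [f.strip() for f in line.split(",")]
        sim, ocean, display, psf, result = fields
        mapping[sim] = {"ocean": ocean, "display": display,
                        "psf": psf, "result": result}
    return mapping

section = """
# Simulator  OCEAN(script)   Display    psf file   Result
tran, analysis('tran, Transient, timeSweep, wave
op,   ?saveOppoint t, DC OpPt,   opBegin,   oppt
"""
table = parse_analysis_type(section.splitlines())
print(table["tran"]["psf"])   # → timeSweep
```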
OPPTNV_SETUP
The OPPTNV_SETUP section of the customization file is used to specify the name of the
initialization (.ini) file for annotation of operating point node voltages as follows:
opPtNvSetup.ini
where opPtNvSetup is the name of the .ini file, which must be copied to
$ACV_ROOT/admin prior to opening the environment.
For example, the OPPTNV_SETUP section of a customization file might be written as follows:
OPPTNV_SETUP {
defaultOpPtSetup.ini
}
The initialization file contains a list of operating point parameters (for example, v, i, pwr) to
back-annotate for each device. For example, the defaultOpPtSetup.ini file might
contain the following:
…
resistor(
annotate(
v
i
pwr
)
)
capacitor(
annotate(
i
)
)
…
bjt(
annotate(
ic
ib
vbe
vbc
vce
)
)
…
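A small parser for this device/annotate structure can illustrate how the file maps each device type to the operating point parameters to back-annotate. This sketch is illustrative only; the real file may contain constructs not handled here:

```python
# Illustrative sketch: read deviceName( annotate( param... ) ) blocks into
# a dictionary of {device: [parameters to annotate]}.
def parse_oppt_setup(text):
    setup, device, in_annotate = {}, None, False
    for raw in text.splitlines():
        tok = raw.strip()
        if not tok:
            continue
        if tok == "annotate(":
            in_annotate = True
        elif tok == ")":
            in_annotate = False        # closes annotate( or the device block
        elif tok.endswith("("):
            device = tok[:-1]          # e.g. "resistor(" -> "resistor"
            setup[device] = []
        elif in_annotate and device:
            setup[device].append(tok)  # parameter name, e.g. v, i, pwr
    return setup

sample = """resistor(
  annotate(
    v
    i
    pwr
  )
)
capacitor(
  annotate(
    i
  )
)"""
print(parse_oppt_setup(sample))
```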
VIEW_NETLIST_RUN
The VIEW_NETLIST_RUN section of the customization file contains the list of file names
and/or file extensions (in the run directory) to be displayed when the user performs the View
Netlist action from the Status/Job Status tab. For tests, this is the Status tab (on the Run
tab of the Test Setup window); for sweeps/corners, it is the Job Status tab of the Sweep or
Corners window.
For a simulator, the design netlist can be contained in a single file, or it can be contained in
more than one file. All files matching the criteria specified in the VIEW_NETLIST_RUN section
of the customization file are displayed. Multiple files are combined into a single file called
netlist.all in the run directory, and netlist.all is displayed.
VIEW_NETLIST_TEST
The VIEW_NETLIST_TEST section of the customization file contains the list of file names
and/or file extensions (in the project/test directory) to be displayed when the user
performs the View Netlist action from the right-click pop-up menu for a test or from the Run
menu in the VSdE main window.
For a simulator, the design netlist can be contained in a single file, or it can be contained in
more than one file. All files matching the criteria specified in the VIEW_NETLIST_TEST
section of the customization file are displayed. Multiple files are combined into a single file
called netlist.all in the run directory, and netlist.all is displayed.
EXPAND_DOT
The EXPAND_DOT section of the customization file is used to specify a set of state files and
the fields in those files that can contain relative paths that the environment needs to expand
(for example, ./BiCMOS). The format of each line in the EXPAND_DOT section depends on
whether the field name is a single word or more than one word:
stateFileName, fieldName
stateFileName, (fieldNameWord1 fieldNameWord2)
Note: For a state file that does not contain a field name, an asterisk is used instead:
stateFileName, *
For example, the EXPAND_DOT section of a customization file might be written as follows:
EXPAND_DOT {
modelSetup, *
}
Another example is
EXPAND_DOT {
simulationFiles, includePath
simulatorOptions, (verilogOpts libraryDir)
simulatorOptions, (verilogOpts libraryFile)
simulatorOptions, (verilogOpts commandFile)
simulatorOptions, (verilogOpts optionsFile)
}
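As a sketch of what this expansion means, a leading ./ in a flagged field value is resolved against a base directory. The function below is illustrative; the environment's actual base directory and API are not specified here:

```python
import os

# Illustrative sketch: expand a leading "./" in a state-file field value
# to a full path, as the EXPAND_DOT section instructs the environment to do.
def expand_dot(value, base_dir):
    if value.startswith("./"):
        return os.path.join(base_dir, value[2:])
    return value  # absolute or non-relative values pass through unchanged

print(expand_dot("./BiCMOS", "/proj/amp"))   # → /proj/amp/BiCMOS
print(expand_dot("/abs/path", "/proj/amp"))  # unchanged
```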
STATUS
The STATUS section of the customization file is used to specify the directory and file in which
to find simulation status information, such as CPU and elapsed time, using the following
format:
tag, directory, fileName, "string"
where tag is a label for the status item (for example, CPU or ELAPSED), directory is the
subdirectory of the run directory containing the status file, fileName is the name of the file
to scan, and "string" is the pattern used to locate the status information in that file.
For example, the STATUS section of a customization file might be written as follows:
STATUS {
CPU = psf spectre.out "CPU = %s %s"
ELAPSED = psf spectre.out "elapsed = %s %s"
}
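A sketch of how such a pattern could be applied, treating each %s as one whitespace-delimited token. This interpretation is an assumption based on the example above, not a documented algorithm:

```python
import re

# Illustrative sketch: turn a scanf-like STATUS pattern such as "CPU = %s %s"
# into a regular expression and use it to extract the status tokens from a
# line of simulator output.
def match_status(pattern, line):
    # Escape the literal parts of the pattern; each %s captures one token.
    regex = r"(\S+)".join(re.escape(part) for part in pattern.split("%s"))
    m = re.search(regex, line)
    return m.groups() if m else None

print(match_status("CPU = %s %s", "Total CPU = 12.4 s used"))
```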
7
Creating Tests for Spectre Simulation
8. Click OK.
The Test Setup window appears (see “Test Setup Window—Spectre Tests” on page 186).
To select a design netlist from a library location that is not listed in the Library drop-down list,
do the following:
1. On the Select Netlist form, click Select File.
The Select Netlist File form appears.
If there are local or global parameters specified in the input design file (netlist), these
parameters, their defaults, and their editable values are displayed in the Local or Global
parameters table on the Netlist tab of the Test Setup window, respectively (see “Netlist
Tab—Spectre Tests” on page 187).
The Library List button is used to open the Library List Manager form containing a list of
available libraries as follows:
Column Contents
Library Name Name of library
Note: This name can be anything and does not have to correspond
to an actual directory location.
Library Map String Full path to library directory location
Note: This location must exist and can contain environment
variable representations of path segments.
Library Location Fully expanded form of Library Map String (with true path
substituted for any environment variables)
Note: This field cannot be edited.
Library Map File Name of library map file from which the indicated library
information was read—for example, acv.lm, default_acv.lm,
cds.lib (for schematic editor users)
Note: Any changes made on this form are written to the local
acv.lm file for the project/workspace. See also “Library Mapping
File” in the Virtuoso Specification-driven Environment
Reference
Button Description
Adds a new row to the table
Deletes the selected row
The Netlist tab is used to display information about the design netlist.
The Local parameters and Global parameters tables of the Netlist tab contain lists of
local and global parameters, respectively, if any are found in the design netlist. The
columns of the tables are
■ Parameter—contains the name of the parameter
■ Value—contains the currently assigned value of the parameter
■ Default—contains the default value of the parameter
Note: Only the currently assigned value (in the Value column) can be changed.
Local parameters (see “Local Parameters” on page 101) are defined using a parameters
statement within a subcircuit.
Global parameters (see “Global Parameters” on page 101) are defined using a parameters
statement at the top level of the design (outside all subcircuits).
Note: For information on the parameters statement, see “Parameters Statement” in the
Virtuoso Spectre Circuit Simulator User Guide.
The following items are available on the Netlist tab of the Test Setup window:
Item Purpose
Browse button Opens the Select Netlist form (see “Specifying a Netlist
Location” on page 182)
Edit button Opens the indicated netlist design file in an editing window for
modification
Top subckt/module and Displays the subcircuit or module name along with its instance
Instance name fields name as found in the design netlist
Note: If the indicated file contains more than one subcircuit or
module definition, then the top definition can be selected from
the drop-down list of names in the Top subckt or Top module
field.
Net Map button Opens the Net Mapping form (see “Using the Net Mapping
Form” on page 189)
The Net Mapping form displays an overview of mapped/connected nets and can be used to
add new nets/connections:
Note: Typically, and as recommended, net mapping is carried out on the
Components tab (see “Components Tab” on page 130) or the Measures tab (see
“Measures Tab” on page 141).
When a netlist is specified for a test, the nodes of the Top subckt are used to create a set of
net names, which are displayed in the Test nets list box on the left side of the Net Mapping
form. When a component is added to a test, its pins are added to the Component pins list,
on the right side of the form. Double-clicking a net or pin name, or clicking the + to its left,
expands the list of connected items.
Note: Only nets that are connected have a +. Unconnected nets are listed without a +.
A test net can be connected to one or more component pins (either by dragging a net over to
a pin, or by selecting a net and one or more pins and clicking Connect). Test nets can be
added, removed, or renamed.
Note: Pin connections are checked when the test is run (see “Run Tab” on page 150).
List area tool buttons for these tabs are used as follows:
Tip
Double-clicking in the list area produces the same result.
Removes the selected item from the field
Re-orders the items in a list area by moving the selected item either up or
down, as the arrow indicates
Tip
Pressing Alt and the up arrow or Alt and the down arrow produces
the same result.
Displays the Open form (for files) or the Browse for Folder window (for
directories) from which a file or directory folder can be selected
Tip
Move the mouse so that the cursor hovers over a button to view its tooltip.
Additionally, if there are any test includes templates available (see “Creating and Managing
Workspace Templates” on page 64), then includes information from a template can be added
to the test by selecting the template from the Workspace includes drop-down menu.
The Netlist Files subtab of the Includes tab for Spectre looks like this:
The Other Includes subtab of the Includes tab for Spectre looks like this:
The Beginning, Pre-run, and Ending OCEAN command areas on the OCEAN
Commands subtab of the Includes tab for Spectre can be used to override Test Setup
information, or to perform special post-processing of output files after each simulation run.
For each run of the test with a given set of parameter values (that is, for each iteration), the
final simulation script will contain the following commands, in the order presented here:
■ Beginning OCEAN commands
■ Commands specifying alters, simulation options, analyses, etc.
■ Pre-run OCEAN commands
■ Run command and measures
■ Ending OCEAN commands
Spectre or SpectreRF simulation setup is enabled by selecting one of the radio buttons
along the top-left edge of the Analyses tab. The set of analysis types available for Spectre
or SpectreRF simulation is displayed in the list area beneath the radio buttons.
Each analysis type displayed in the list area is matched with a check box which is used to
enable/disable the analysis. A mark in the check box enables the analysis. The appearance
of the tab depends on the analysis type—and in some cases the options (for example, XF
analysis, Sweep variable selection)—selected. An analysis type can be selected without
being enabled.
Note: For information on Spectre analyses, options, and settings as they pertain to the tabs
and forms provided in the environment, see the Virtuoso Spectre Circuit Simulator User
Guide and the Virtuoso Spectre Circuit Simulator Reference. For information on
SpectreRF analyses, options, and settings, see the SpectreRF User Guide. The Help
button on each form displays the same text as spectre -h.
In addition, nodesets and initial conditions can be specified, and operating point information
saved, by clicking Nodesets/ICs to open the Analyses Nodesets and Initial Conditions form:
Operating point nodeset information and initial conditions for simulation are specified as
node-voltage pairs using one of the following methods:
■ Click Select to bring the schematic window into the foreground in selection mode, then
click to select one or more nets on the schematic. The name of each net appears in the
Nodes column of the table. Type a voltage value (or variable expression) for each net in
the corresponding cell in the Voltages column.
■ Click Add to add a row to the table, then type the node name in the Nodes column and
the voltage value (or variable expression) in the Voltages column.
Specified nodesets and initial conditions are only used during simulation when the
appropriate check box is marked on this form:
■ When the Specify and use Nodesets check box is marked, the nodeset information
specified in the corresponding Nodes and Voltages table is used during simulation.
■ When the Specify and use ICs check box is marked, the initial conditions specified in
the corresponding Nodes and Voltages table are used during simulation.
The Delete button is used to remove the selected row or rows from the table. Click OK to
save changes and close the form.
The Temperature parameters group box contains the following simulation options:
The items in the Nodes/currents save list group box are used to specify which nodes and
currents are saved to the simulation results file:
■ The Save allpub nodes and Save all currents check boxes are used to save all public
node and/or current information.
■ The Select button is used to select specific nets or terminals from the schematic.
Alternatively, specific nodes and currents to be saved to the simulation results file can be
specified in the scrolling list area in the Nodes/currents save list group box. For example:
■ To save net24, specify net24
■ To save the current through VDD_source.vdc, specify VDD_source.vdc:i
Note: Nets/currents are written to the simulation control file using the Spectre save
command. See the Virtuoso Spectre Circuit Simulator User Guide for more information.
The Advanced button is used to open the Advanced Sim Options form for specifying
advanced simulation options.
Note: For information on Spectre circuit simulator options and settings as they pertain to the
tabs and forms provided in the environment, see the Virtuoso Spectre Circuit Simulator
User Guide and Reference. The Help button on each form displays the same text as
spectre -h.
8
Creating and Running Advanced
Analyses
After a circuit is functional, most design processes require an advanced analysis phase
where a predefined set of tests is run. Advanced analyses typically include parametric
sweeps, corners, and Monte Carlo analyses. These analyses are set up using the Sweep and
Corners windows. The following topics are discussed:
■ Sweeps on page 201
■ Corners Analysis on page 209
■ Monte Carlo Analysis on page 216
Note: Monte Carlo analysis is only supported for VSdE Native (Spectre) simulation.
For information about copying advanced analysis project items, see “Copying Advanced
Analysis Project Items” on page 223.
Sweeps
A sweep defines how global variables should be varied over tests to produce characterization
results. Characterization results are obtained by creating sweeps that define a range of
values that should be applied to one or more project parameters (global variables) when
running tests. Sweep values can be defined as either a list of values, or as a range of values
with specified increment (from-to-by). A sweep analysis is run by sweeping project
parameters over the test.
For each sweep, the analysis specified in each test is performed using the parameters
specified in the Sweep parameters table on the Sweep tab (see “Defining Sweep
Parameters” on page 203, next).
The Workspace sweep check box and drop-down menu are available for selection when
there are one or more by-reference workspace templates available (see “Creating and
Managing Workspace Templates” on page 64). Once the check box is marked, the Sweep
parameters table disappears and the following actions can be performed:
Action Description
Drop-down menu Select from the set of one or more by-reference workspace
templates available
Edit Display the sweep parameter setup information in the template
for editing
Templates Open the Workspace Templates form (see “Creating and
Managing Workspace Templates” on page 64)
The Sweep parameters table lists parameters and their sweep values. To add a parameter
to the table, do the following:
1. Click Add Param to add a row to the table.
Note: The Run check box is marked by default, thus enabling the parameter for the
sweep. See also “Enabling/Disabling Sweep Parameters” on page 206.
2. Select a parameter from the drop-down list in the Parameter column.
Note: The list of parameters represents the project and workspace parameters from the
tables in the VSdE main window. Only parameters that have not yet been defined, and
parameters that are not enabled (see “Enabling/Disabling Sweep Parameters” on
page 206), appear on this list.
3. Select one of the following sweep types from the drop-down list in the Type column,
providing values accordingly:
Important
You can use project and workspace parameters for List values only; other values
must be numerical. See “Understanding Parameters” on page 91 for information
about project and workspace parameters.
Type Values
List List of values (space-separated)
From Range of values specified as From, To, By (increment)
Log Logarithmic sweep from first value (Log), To, by Pts (total
number of points)
Log_Decade Logarithmic sweep from first value (Log_Decade), To, by Pts
(points per decade)
Log_Octave Logarithmic sweep from first value (Log_Octave), To, by Pts
(points per octave)
4. (Optional) Change the value in the drop-down list in the Alterable column:
Caution
Setting this flag incorrectly can cause invalid simulation results. Change
its value from the default (Auto) with extreme caution.
❑ Auto (this is the default selection) lets the environment determine whether a
parameter is alterable or not using the following algorithm: Swept parameters that
are numeric are alterable; swept parameters that are strings are not.
❑ Yes tells the environment not to generate multiple simulation input files when this
parameter is varied, because it can be swept using an alter statement in the input
file. (Swept parameters that are numeric are alterable.) Select Yes only when
multiple simulation input files are not required to vary this parameter.
❑ No forces generation of multiple simulation input files when this parameter is varied,
because it can only be swept by generating one input file for each value of the
parameter. (Swept parameters that are strings are not alterable.) Select No only
when multiple simulation input files are required to vary this parameter.
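The five sweep types and the Auto alterable rule (numeric swept values are alterable, strings are not) can be modeled as follows. This is a sketch only: the exact endpoint and rounding behavior of the environment is an assumption here:

```python
import math

# Illustrative sketch of the five sweep types; exact rounding rules assumed.
def sweep_values(kind, frm=None, to=None, by=None, pts=None, values=None):
    if kind == "List":
        return list(values)                       # explicit list of values
    if kind == "From":                            # linear from-to-by range
        n = int(round((to - frm) / by))
        return [frm + i * by for i in range(n + 1)]
    if kind == "Log":                             # pts = total number of points
        n = pts
    elif kind == "Log_Decade":                    # pts = points per decade
        n = int(round(math.log10(to / frm) * pts)) + 1
    else:                                         # Log_Octave: pts per octave
        n = int(round(math.log2(to / frm) * pts)) + 1
    ratio = (to / frm) ** (1.0 / (n - 1))
    return [frm * ratio ** i for i in range(n)]

def is_alterable_auto(value):
    # Auto rule from the text: numeric swept values are alterable, strings are not.
    try:
        float(value)
        return True
    except ValueError:
        return False

print(sweep_values("From", frm=0.0, to=1.0, by=0.25))
print(sweep_values("Log", frm=1.0, to=1000.0, pts=4))
print(is_alterable_auto("3.3"), is_alterable_auto("fast"))
```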
The Run check box to the left of each parameter in the Sweep parameters table is used to
enable/disable sweep parameters as follows:
■ When the Run check box is marked, the parameter is enabled for the sweep.
■ When the Run check box is not marked, the parameter is disabled for the sweep.
Multiple definitions can be specified for disabled parameters (see “Creating Multiple
Definitions for a Sweep Parameter” on page 206). Disabled parameters are saved to the
sweep file, and can be imported using the File – Import function on the Sweep tab (see
“Loading/Saving Sweep Parameter Definitions” on page 208).
Only enabled parameters can be saved to a CSV file (see “Loading/Saving Sweep Parameter
Definitions” on page 208), or converted to corners (see “Converting a Sweep to a Corners
Setup” on page 209).
Multiple definitions can be specified only for disabled parameters (see “Enabling/Disabling
Sweep Parameters” on page 206). Only one definition for a given parameter can be enabled
at a time.
Multiple definitions are saved to the sweep file, and can be imported using the File - Import
function on the Sweep tab (see “Loading/Saving Sweep Parameter Definitions” on
page 208).
Important
Disabled parameters cannot be saved to a CSV file (see “Loading/Saving Sweep
Parameter Definitions” on page 208), or converted to corners (see “Converting a
Sweep to a Corners Setup” on page 209).
The File drop-down menu offers the following selections for loading/saving sweep parameter
definitions (see “Adding a Sweep Parameter” on page 204 for more information about sweep
settings):
Import Displays the Open form for navigating to and selecting a .swp file
from which to import sweep parameter definitions
Note: Both enabled and disabled parameter definitions are
imported (see “Enabling/Disabling Sweep Parameters” on
page 206).
Important
Any loaded/imported information that matches what is already specified on the tab
overrides the existing information.
Corners Analysis
After a design is functioning as desired at the nominal process settings or model parameters,
a corners analysis can be used to verify performance of the design at the extreme-case
manufacturing process conditions. It is expected that if the design functions at worst-case and
best-case conditions, it works over most manufacturing variations to achieve a yield within
tolerance.
Tests are defined earlier in the flow to specify analyses (for example, AC, DC, transient),
simulation options, and global variable settings (see Chapter 5, “Test Setup from the
Schematic Window”). Corners analyses are run over one or more of these tests using a set
of parameters specified as corners (for example, model files, temperature, supply voltage).
For example:
Corner 1 Corner 2
typ bjt model typ bjt model
min mos model min mos model
vdd=3 vdd=5
temperature=25 temperature=55
Each analysis is run independently; however, multiple analyses can be run in parallel on
different hosts (see “Job Distribution Group Box” on page 228).
To open the New Corners Name form for creating a new corners analysis, do the following:
1. Choose File – New in the VSdE main window.
The New File window appears.
2. In the Corners name field, type a name for the corners analysis.
3. Click OK.
4. (Optional) Select or create a template on which to base the new corners analysis.
5. Click OK.
The Corners tab of the Corners window appears for specifying a corners analysis.
For each corners analysis, the analysis specified in each test is performed using the
parameters specified in the Corners parameters table on the Corners tab (see “Defining
Corners Parameters” on page 211, next).
Parameters available in the Corners parameters table come from the table of Project
Parameters specified in the VSdE main window (see “Project Parameters” on page 91). The
buttons that operate on the Corners parameters table of the Corners tab are described in
the following table:
Button Procedure/Function
Add Param Adds a new column to the right of the existing columns in the table
and automatically enters the first available name from the list of
parameters defined when the test was created—see “Adding Corners
Parameters” on page 213
Note: To specify the desired parameter, scroll down the parameter
list on the context-sensitive drop-down menu and select the
parameter name.
Button Procedure/Function
Remove Param Removes the selected parameter column from the table
Add Corner Adds a row to the Corners parameters table for specifying values
in the corners parameter columns—see “Defining Corners
Parameter Values” on page 214
Remove Corner Removes the selected row from the table
Add Sweep Adds fields for specifying a sweep variable (Sweep), sweep type
(Type), and sweep values (either a List, or a range for From, Log,
Log_Decade, or Log_Octave)—see “Adding a Sweep of Corners”
on page 215
Remove Sweep Removes the selected sweep
File Presents a drop-down menu of the following selections:
Save As CSV File opens the Save As form for saving the corners
definitions to a .csv file
Load CSV File displays the Open form for navigating to and
selecting a .csv file containing corners definitions, according to the
following format:
Row,Run,parameterList*
1,RunBox1,valueList1*
2,RunBox2,valueList2*
…
For example:
Row,Run,$corner,$vdd_value,$temperature
1,1,nom,3,-40
2,1,slow,3.3,25
3,1,fast,3.6,125
Important
Any imported information that matches what is already
specified on the tab overrides the existing information.
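The Load CSV File format above can be read with the standard csv module. The following is a minimal sketch; the dictionary layout is an illustrative choice, not the product's internal representation:

```python
import csv
import io

# Illustrative sketch: read a corners .csv in the "Row,Run,parameterList"
# layout into a list of corner dictionaries (Run == "1" meaning enabled).
def load_corners(text):
    rows = list(csv.reader(io.StringIO(text)))
    params = rows[0][2:]                  # header: Row, Run, then parameters
    corners = []
    for row in rows[1:]:
        corners.append({
            "run": row[1] == "1",         # Run check box state
            "values": dict(zip(params, row[2:])),
        })
    return corners

sample = """Row,Run,$corner,$vdd_value,$temperature
1,1,nom,3,-40
2,1,slow,3.3,25
3,1,fast,3.6,125"""
corners = load_corners(sample)
print(corners[1]["values"]["$corner"])   # → slow
```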
The Workspace corners check box and drop-down menu are available for selection when
there are one or more by-reference workspace templates available (see “Creating and
Managing Workspace Templates” on page 64). Once the check box is marked, the Corners
parameters group box disappears and the following actions can be performed:
Action Description
Drop-down menu Select from the set of one or more by-reference workspace
templates available
Edit Display the sweep parameter setup information in the template
for editing
Templates Open the Workspace Templates form (see “Creating and
Managing Workspace Templates” on page 64)
For example, a corner might set $temperature to 40 and $process_file to
model_$temperature.mod.
Note: The Run check box to the left of each corner is used to enable/disable a corner for a
given run.
Each row of the Corners parameters table represents one corners analysis. Everything
specified in that one row is run in one simulation. If there are four rows of information in the
table, then four corners analyses are performed per selected test. For example, for two
selected tests and two defined corners, four simulations are performed:
Corners over tests:
gain
bias
Corners parameters:
Corner 1 …
Corner 2 …
Simulations:
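The corners-over-tests enumeration above is a cross product of the selected tests and the defined corners. A minimal sketch, using the test and corner names from the example:

```python
from itertools import product

# Illustrative sketch: each defined corner is run over each selected test,
# so the number of simulations is len(tests) * len(corners).
tests = ["gain", "bias"]
corners = ["Corner 1", "Corner 2"]
simulations = [f"{t} @ {c}" for t, c in product(tests, corners)]
print(len(simulations))   # → 4
```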
3. Select one of the following sweep types from the drop-down list in the Type column,
providing values accordingly:
Type Values
List List of values (space-separated)
From Range of values specified as From, To, By (increment)
Log Logarithmic sweep from first value (Log), To, by Pts (total
number of points)
Log_Decade Logarithmic sweep from first value (Log_Decade), To, by Pts
(points per decade)
Log_Octave Logarithmic sweep from first value (Log_Octave), To, by Pts
(points per octave)
Important
Sweep values must be numeric; they cannot be strings.
Note: The Yes in the Alterable column tells the environment not to generate multiple
simulation input files when this parameter is varied. This setting cannot be changed.
Worst-case and best-case situations—where every parameter is set to either the low end or
the high end of the allowed range—can be considered. Monte Carlo analysis supports the
following tasks:
■ Randomize specified values for parameters
■ Create multiple result files showing simulation results based on random values for
parameters
Before specifying a Monte Carlo analysis for simulating randomized iterations, the following
requirements must be met:
■ Functions required for Spectre Monte Carlo analysis have been specified in the model
files and on the device instances (see the Virtuoso Spectre Circuit Simulator
Reference for more information)
■ A sweep or corners analysis that references this netlist exists, and is enabled on the
Sweep or Corners tab (see “Sweeps” on page 201 or “Corners Analysis” on page 209)
■ Spectre is selected in the For simulator list area on the Monte Carlo tab and the
Enable check box has been marked
The Monte Carlo tab for Spectre circuit simulation features the following fields and controls
(see the Virtuoso Spectre Circuit Simulator Reference for more information):
Field Description
Seed Sets a starting point for the random number generator (a positive
integer)
Note: The spin box allows selection of an integer in the range 0-100,
or any positive integer value can be typed directly in the field
Start iteration Specifies the starting simulation iteration (a positive integer)
Number of runs Specifies the number of simulation iterations to perform (a positive
integer)
Statistical variation level Radio buttons select the statistical variation type; one of
■ Process
■ Mismatch
■ All
The Job Status tab is displayed when one of the following actions is performed:
■ In the Sweep or Corners window, click Run to run the analysis.
■ In the Sweeps/Corners folder on the Files tab of the VSdE main window, select an
advanced analysis project item and choose Run – Run Sweep.
■ In the Sweeps/Corners folder, right-click an advanced analysis item and select Run
from the pop-up menu.
Column Description
Host List of host machines for the running job
Status Status (waiting, running, passed, or failed) of each simulation
Completed Running count of completed simulations
Count Total number of simulations for the sweep
Simulator Simulator used for the runs (for example, Spectre)
Note: After Run has been clicked, the Stop button can be used to terminate the simulations
before they are fully completed.
Button Purpose
Waveform Opens the Select Waveform Files form (see “Viewing Waveforms” on
page 278)
View Netlist Opens the netlist for the sweep in a read-only text editor window
View Script Opens the command file (script) for the sweep in a read-only text
editor window
View Output Opens the simulator output messages in a read-only text editor
window
View Errors Opens any simulation error messages in a read-only text editor
window
View Cellviews Opens the viewsFound file for the netlisted cells in a read-only text
editor window
View Other Opens the View Run File form for selecting a run file to display in a
read-only text editor window
Open Xterm Opens an xterm window in the simulation run directory
Note: The xterm command is specified using the ACV_XTERM
environment variable whose default value is:
ACV_XTERM = xterm -sb -sl 500 -geometry 100x30
Note: The View Netlist, View Script, View Output, View Errors, View Other, and Open
Xterm buttons are available only when the Keep sim directories check box is marked in the
Simulation options group box on the Run Options tab. The Waveform button is available
only when the Keep RAW files check box is marked in the Simulation options group box
on the Run Options tab. See “Simulation Options Group Box” on page 244.
Results Tab
Running advanced analyses causes one or more simulation files to be generated, and the
output results data is captured. A separate result file (.res) is created for each sweep/test
combination. Results file names are displayed on the Results tab. The default results
directory is either
■ project_dir/results/lib/cell/view/results, if there are no workspace
value sets defined, or
■ project_dir/results/lib/cell/view/wksp_set_name, where
wksp_set_name is the name of the currently active workspace value set (for example,
DefaultSet)
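A minimal sketch of this directory selection (the function name is hypothetical):

```python
# Illustrative sketch of the results-directory rule described above: use the
# "results" leaf when no workspace value set is defined, otherwise use the
# active workspace value set name.
def results_dir(project_dir, lib, cell, view, wksp_set_name=None):
    leaf = wksp_set_name if wksp_set_name else "results"
    return "/".join([project_dir, "results", lib, cell, view, leaf])

print(results_dir("/proj/amp", "ampLib", "opamp", "schematic"))
print(results_dir("/proj/amp", "ampLib", "opamp", "schematic", "DefaultSet"))
```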
Button Description
MC Plot Presents a menu of the following items and initializes MC plotting using either
a Waveform Window or a Graph Window, depending on the Default for
Calculator and Direct Plot selection in the Defaults group box on the
Other tab of the Options window (see “Defaults Group Box” on page 242):
■ Filter
■ Specification Limits
■ Print Iteration vs Value
■ Print Correlation
■ Plot Histogram
■ Plot Scatterplot
■ Simple Yield
■ Conditional Yield
■ Multiconditional Yield
Note: The environment must be run from icms for successful initialization of
MC plotting with the correct lib/cell/view information.
Plot Opens the Plot Results window for the selected results file (see “Plotting
Results” on page 283)
View Opens a Results table window of measured results from all the runs in the
selected results file (see “Viewing Table Results” on page 255)
Note: Double-clicking the results file name is a shortcut for selecting the
results file name and clicking View.
Note: For more information on results analysis and display, see Chapter 10, “Viewing and
Analyzing Results.”
If the name of the dropped/pasted item is the same as another item already in the target
folder, then the Copy from ‘item_name’ form is opened:
The target folder can be one of the following in any open project in the main window of any
session:
■ The same as the source folder
■ A like project folder in another open project (for example, a sweep can only be copied
into a Sweeps/Corners project folder)
■ A like project folder in an open project in another session
9
Specifying Run Options
Run options are specified on the Run Options tab of the Sweep and Corners windows for
sweeps and corners analyses, and on the Options tab on the Run tab of the Test Setup
window for tests (see “Run Tab—Options Tab” on page 152). Run options are defined using
the following group boxes of information:
■ Overrides Group Box on page 225
■ Job Distribution Group Box on page 228 and Options Window on page 231
■ Simulation Options Group Box on page 244
Note: For Spectre simulations, the ACV_SPECTRE_MPS_TIMEOUT environment variable can
be used to control how long the Virtuoso® Specification-driven Environment (SdE) waits for
the simulator to start (before issuing an error message). The timeout value is specified in
seconds (for example, setenv ACV_SPECTRE_MPS_TIMEOUT 300). If the environment
variable is not set, then the timeout value is 600 seconds; if it is set to a number smaller than
60, then the timeout value is 60 seconds.
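The timeout rule above can be sketched as follows; the effective_timeout helper is illustrative only, not part of the tool:

```shell
# Sketch of the documented timeout rule: unset -> 600 seconds;
# values below 60 are raised to 60. Helper name is hypothetical.
effective_timeout() {
  t="${ACV_SPECTRE_MPS_TIMEOUT:-600}"   # default when the variable is not set
  [ "$t" -lt 60 ] && t=60               # enforce the 60-second floor
  echo "$t"
}
```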
Important
The overriding DUT must have the same pin names and pin order as the DUT
referenced in the Tests folder on the Files tab. The parameters must also be the
same if values are passed to them.
Note: For Sweep and Corners, the cell name comes from the Cell specified in the test.
4. In the View field, type the view name.
When the analysis is run again, the environment evaluates the design found at the new
location.
For Netlist-only designs, the Overrides group box contains the following fields.
Note: For Sweep and Corners, the cell name comes from the Cell specified in the test.
4. In the View field, type the view name.
5. In the File field, type the name of the file containing the overriding DUT netlist.
6. In the Type field, select a file type from the drop-down list.
When the analysis is run again, the environment evaluates the netlist found at the new
location.
Options associated with job distribution consist of the following items specified in the Job
distribution group box:
Item Description
Run distributed check box Enables/disables distributed simulations
Note: When marked, simulations are distributed across multiple
machines—or as multiple jobs on the same machine—as defined on
the Distribution tab of the Options window (see “Distribution Tab” on
page 232 and “Launching Remote Jobs” on page 229)
Options button Opens the Options window (see “Options Window” on page 231)
Note: This window also appears when Tools – Options is selected
in the VSdE main window (see "Tools Menu" in the Virtuoso
Specification-driven Environment Reference).
Max jobs field Indicates the maximum number of simulation decks and
simultaneous simulations for sweeps/corners as described below
The Max jobs setting applies only to sweeps/corners and serves two purposes:
1. Provides guidance as to the number of simulation decks to generate (see “Generating
Simulation Decks for Max Jobs” on page 230).
2. Specifies the number of jobs to submit for distributed simulations (see “Running
Distributed Simulations and Max Jobs” on page 230).
Note: If the Run distributed check box is not marked, then as many as Max jobs are run
simultaneously on the local host; and setting Max jobs to 0 is the same as setting it to 1.
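For local (non-distributed) runs, the Max jobs behavior described in the note amounts to the following; effective_max_jobs is a hypothetical helper, not a tool command:

```shell
# Sketch of the documented rule: Max jobs of 0 behaves the same as 1;
# otherwise up to that many jobs run simultaneously on the local host.
effective_max_jobs() {
  n="$1"
  [ "$n" -eq 0 ] && n=1
  echo "$n"
}
```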
where hostName is a valid host machine name, and runDir is the project run directory path
(for example, /home/user/projects/sweep-test/run0).
Note: The rsh command must be able to succeed for the remote hosts specified, and these
remote hosts must have access to Cadence software. Common reasons for failure of the rsh
command include the requirement to enter a password on the remote system, or syntax
errors in files sourced on the remote system. In some cases, .rhosts files can be used to
remove the password requirement. Syntax errors must be found and fixed.
In order to customize the generated rsh command (for example, to specify the Cadence
project name on the remote host), the ACV_PRERUN_CMD and ACV_POSTRUN_CMD
environment variables can be set prior to opening the environment. The command string
specified by the ACV_PRERUN_CMD environment variable is run prior to the cd command; the
command string specified by the ACV_POSTRUN_CMD environment variable is run after the
./runSIM command.
For example, in order to specify the Cadence project name on the remote host, the selproj
command must be issued prior to the cd command. The ACV_PRERUN_CMD can be set prior
to opening the environment as follows to set the project to myProject:
setenv ACV_PRERUN_CMD "selproj myProject"
Tip
You can set the ACV_SIMRUN_RSH environment variable prior to running the
software to change the command used to launch remote jobs to something other
than rsh (or remsh on HP stations). The format is as follows:
setenv ACV_SIMRUN_RSH commandName
The command you select must accept the same arguments as rsh (or remsh).
For example:
setenv ACV_SIMRUN_RSH ssh
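Putting these pieces together, the remote command might be assembled roughly as follows. This is a sketch only: build_remote_cmd is a hypothetical helper, and the environment's actual quoting and ordering may differ.

```shell
# Illustrative assembly of the remote command from the documented parts:
# ACV_PRERUN_CMD runs before cd, ACV_POSTRUN_CMD after ./runSIM, and
# ACV_SIMRUN_RSH (default rsh) selects the launch command.
build_remote_cmd() {
  host="$1"; run_dir="$2"
  cmd=""
  [ -n "$ACV_PRERUN_CMD" ] && cmd="$ACV_PRERUN_CMD; "
  cmd="${cmd}cd $run_dir; ./runSIM"
  [ -n "$ACV_POSTRUN_CMD" ] && cmd="$cmd; $ACV_POSTRUN_CMD"
  echo "${ACV_SIMRUN_RSH:-rsh} $host \"$cmd\""
}
```

For example, with ACV_PRERUN_CMD set to "selproj myProject", the sketch produces a command of the form rsh hostName "selproj myProject; cd runDir; ./runSIM".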
If the Run distributed check box is marked and the Queue command radio button is
selected instead (see “Queue command” on page 235), then all jobs are submitted to the
queue and the queuing software is used to manage the loading.
Options Window
To open the Options window, do one of the following:
➤ In the VSdE main window, choose Tools – Options (see "Tools Menu" in the Virtuoso
Specification-driven Environment Reference).
➤ In the Job distribution group box wherever it appears, click Options.
Note: The Job distribution group box appears on the Run Options tab of the Sweep
and Corners windows for sweeps and corners analyses, and on the Options tab on the
Run tab of the Test Setup window for tests (see “Run Tab—Options Tab” on page 152).
Distribution Tab
The Distribution tab of the Options window is used to configure the following items:
■ General distribution options on page 233—Check boxes for selecting general options for
distributed simulations.
■ Distribution list on page 233—When selected, enables the Machine file field (by default,
machine_list.mac in the .acv directory in the user’s home directory) and displays
the Machine file contents for distributed simulations.
Note: Multiple hosts can be specified in machine_list.mac. Also, the same machine
name can be listed multiple times to run that many jobs on the same machine.
■ Queue command on page 235—When selected, enables the Queue command field,
which supports special, third-party queuing software that takes a command and locates
a host on which to run it (while considering loading and other factors). Distributed
simulations are launched using the queuing command specified in the Queue
command field (for example, qrsh).
Option Description
Use Parallel Characterization licenses if available
■ When marked, the environment will attempt to check out Parallel
Characterization licenses first, if available.
■ When not marked, the environment checks out full licenses only;
Parallel Characterization licenses (even if available) are not used.
Use C-shell for run script
■ When marked, the C-shell is used to run distributed simulations
(using either the Distribution list or the Queue command); the
ACV_PRERUN_CMD environment variable can be used to source a file
containing C-shell commands.
■ When not marked, the Bourne shell is used to run distributed
simulations.
Distribution list
To specify a machine list file containing a list of available host machines for distributed
simulation, do the following on the Distribution tab of the Options window:
1. Select Distribution list to display the default path to the machine_list.mac file (in
the Machine file field).
2. Click to open the Select File form.
3. Navigate to and select the desired machine_list.mac file that lists the host machines
to be used during distributed simulation.
4. Click Open.
The list of host machines, their availability, and locations are displayed in the Machine file
contents table, as follows:
Column Information
Host Name Valid host machine name
Local Directory Prefix,
Remote Directory Prefix
(Optional) Directory mapping to apply when jobs are run on a
remote machine and the local and remote paths are not the
same
Important
Specifying Local Directory Prefix and Remote
Directory Prefix information is only necessary when the
path to the local project directory is not exactly the same on
both the local and remote hosts. The Remote Directory
Prefix is substituted for the Local Directory Prefix for jobs
running on the remote machine. For example,
/nfs/home/user/projects on the local machine might
be /export/home/user/projects on the remote
machine. The /nfs part (or prefix) of the local path needs
to be mapped to /export (as the prefix) on the remote
host as follows:
Local Directory Prefix = /nfs
Remote Directory Prefix = /export
Start Time (Optional) Simulation start time for jobs submitted on the indicated
host (Host Name), specified as HH:MM on a 24-hour clock (for
example, 18:30 is 6:30 P.M.)
Note: Multiple hosts can be specified. Also, the same machine name can be listed multiple
times to run that many jobs on the same machine. The Local Directory Prefix and Remote
Directory Prefix information must be the same for all entries of the same Host Name.
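The prefix substitution described in the Important note amounts to the following; map_remote_path is an illustrative helper, not a tool command:

```shell
# Sketch of the documented mapping: the Remote Directory Prefix replaces
# the Local Directory Prefix at the start of the local path.
map_remote_path() {
  local_path="$1"; local_prefix="$2"; remote_prefix="$3"
  case "$local_path" in
    "$local_prefix"*) echo "$remote_prefix${local_path#"$local_prefix"}" ;;
    *)                echo "$local_path" ;;   # no mapping applies
  esac
}
```

With the example above, map_remote_path /nfs/home/user/projects /nfs /export yields /export/home/user/projects.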
Information in the Machine file contents table can be modified using the toolbar buttons as
follows:
Button Purpose
Add a row to the table for specifying host machine information
Queue command
For LSF job distribution software, the bsub command (specified in the Command field)
displays status information to standard output, which for the environment is the Simulation
Messages window. For example:
bsub -q sun64
Job <137> is submitted to queue <sun64>.
<<Waiting for dispatch ...>>
<<Starting on sun64.acv.com>>
The BSUB_QUIET environment variable can be used as follows to disable display of these
status messages:
setenv BSUB_QUIET yes
This setenv command must be specified prior to opening the environment (for example, in
.cshrc).
Other Tab
The following check boxes, radio buttons, and drop-down lists are available on the Other tab
of the Options window:
■ Warning Message and Display Check Boxes on page 239
■ Raw File Location Group Box on page 240
■ Job Completion Check Boxes on page 241
■ Defaults Group Box on page 242
■ Dump Autoplot Check Box on page 243
The following check boxes are provided on the Other tab of the Options window to
enable/disable warning messages as indicated:
■ Warn when setting active project and current project is open
When marked, for workspaces with multiple projects, a Warning message is displayed
when switching from one project to another, asking whether the currently active project
and windows/forms associated with it can be closed:
❑ Yes = Close the current project and its windows/forms
❑ No = Keep the current project open
Note: If the In the future, don’t show this warning check box (on the Warning
form) is marked, then the check box on the Other tab of the Options window is
unmarked automatically.
■ Warn when automatic renaming is done
This check box applies to schematic designs. The prefix acv_ is applied to instance
names of any new components added on the Components tab of the Test Setup window
to avoid naming conflicts in netlists generated by the schematic editor.
When this check box is marked, a Warning message is presented whenever the
prefix is applied. To suppress the warning, unmark the check box.
■ Warn when sweep steps exceed the suggested limit
When this check box is marked, a Warning message is presented when the number of
runs exceeds 500. To suppress the warning, unmark the check box.
Note: There is an absolute limit of 10,000 runs.
■ Display full lib/cell/view/testname
Unless this check box is marked, test names are displayed without the lib/cell/view
information (for example, when the Tests folder is open). When this check box is not
marked, the full lib/cell/view path for a test can be seen when the mouse cursor hovers
over the test name. To mark this check box so that the full lib/cell/view path for a test is
displayed, do the following:
a. Select Tools – Options.
The options in the Raw file location group box on the Other tab are as follows:
Option Purpose
Use default Uses the default results location—which is named according to
the default workspace value set, if one exists, or results if no
value set is defined:
results/lib/cell/view/DefaultSet
results/lib/cell/view/results
Specify location Specifies a new results directory location for simulation data
Add workspace/project name to path
Enabled only when Specify location is selected, and marked by
default, this check box causes the workspace and project names
to be appended to the specified results directory location for
simulation data as follows:
resultsLocation/workspaceName/projectName
Tip
The ACV_RAW_PATH environment variable can be set prior to opening the
environment to specify an absolute path to the RAW file location (directory). The
ACV_RAW_PATH setting takes precedence over any project settings in this group
box.
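The precedence described by this group box and the Tip can be sketched as follows; results_path is a hypothetical helper, and the environment resolves this internally:

```shell
# Sketch of the documented precedence: ACV_RAW_PATH (if set) wins;
# otherwise the workspace/project names are appended only when the
# Add workspace/project name to path check box is marked.
results_path() {
  base="$1"; workspace="$2"; project="$3"; append="$4"   # append: yes/no
  if [ -n "$ACV_RAW_PATH" ]; then
    echo "$ACV_RAW_PATH"
  elif [ "$append" = yes ]; then
    echo "$base/$workspace/$project"
  else
    echo "$base"
  fi
}
```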
The directory into which the simulation data files are written for a project can be changed as
follows:
1. In the Raw file location group box on the Other tab of the Options window, select
Specify location.
2. Click .
The Browse for Folder window appears.
3. Navigate to and select a results file folder.
4. Click OK.
The check boxes in the Job Completion group box on the Other tab are used to
enable/disable job completion options—what happens when a job completes:
The following items are available in the Defaults group box on the Other tab:
Item Purpose
Allow MATLAB measures and plotting
Check box to enable/disable the following MATLAB measures and
plotting features:
■ Add – MATLAB Measures on the Measures tab of the
Test Setup window (see the Cadence MATLAB Measures
User Guide for more information)
■ Plot button on the Results tab of the Sweep or Corners
window (see “Results Tab” on page 221) and in the Results
table window (see “Viewing Table Results” on page 255)
Auto-flatten data file nodes on Results tab
Check box to enable/disable the auto-flatten feature for
displaying data from Open Data Files and Associations in
the leaf window on the Results tab in the VSdE main window
(see “Flattening and Unflattening Branches” on page 252):
■ When the check box is marked, the feature is enabled, and
subsequently opened data files or associations are
flattened
■ When the check box is not marked, the feature is disabled,
and subsequently opened data files or associations are
displayed in a hierarchical tree
Default waveform viewer
Drop-down list selection of available waveform viewers:
■ AWD (see the Analog Waveform User Guide)
■ SimVision (see the SimVision User Guide)
■ WaveScan (see the WaveScan User Guide)
Note: The Default waveform viewer selection is also
displayed in the Plot viewer field in the Plot actions group box
on the Select Waveform Files form (see “Plot Actions” on
page 282).
Default for Calculator and Direct Plot
Drop-down list of available Calculator/waveform viewer
configurations to use whenever Calculator, Direct Plot, or MC
Plot is clicked on a form:
■ AWD (see the Analog Waveform User Guide)
■ WaveScan (see the WaveScan User Guide)
When marked, the Dump autoplot to file check box causes the contents of autoplotted
results to be saved to a graphics image file. You can select one of the following image formats:
■ png for Portable Network Graphics image format
■ tiff for Tag Image File Format
■ ps for PostScript image format
The file is stored in the docs subdirectory of the project and appears in the Documents
folder. The name of the file is
autoplot_setupName.imageType
where setupName is the name of the test or sweep setup and imageType is the Image format
you selected (png or tiff or ps).
Note: The file is regenerated each time results are autoplotted after simulation.
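The naming rule can be sketched as follows; autoplot_path is illustrative only:

```shell
# Sketch of the documented autoplot file naming: the file lands in the
# project's docs subdirectory as autoplot_setupName.imageType.
autoplot_path() {
  setup_name="$1"; image_type="$2"   # image_type: png, tiff, or ps
  echo "docs/autoplot_${setup_name}.${image_type}"
}
```

For example, a sweep setup named mySweep with png output gives docs/autoplot_mySweep.png.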
■ Output Dir
This button is used to open the RAW File Location form for specifying the following
options:
Option Description
Use global project options Selected by default, this option uses the
settings specified in the Raw file location group
box on the Other tab of the Options window (see
“Raw File Location Group Box” on page 240), which
apply globally to the entire project
Options Available only when Use global project options is
selected, this button opens the Options window and
displays the Other tab (see “Other Tab” on
page 238)
Specify location When selected, indicates the output directory
location for this particular test/sweep/corners setup
The following simulation options apply only to sweeps and corners analyses; they do not
apply when running a single test:
■ Stop on first failure
When this check box is marked, all simulations stop when one contains errors. This
option is useful for long sweeps to verify that everything is in place for the simulations to
work.
When this check box is not marked, all simulations are run, regardless of whether any
fail.
■ Run incrementally
When this check box is marked, simulations are performed only if there are no previous
values for that sweep or corner condition.
Important
The Run incrementally option should be used carefully because it causes
simulations to be skipped. This option is never saved with the project item (for
example, by clicking Apply); rather, it must be marked explicitly before clicking Run.
When this check box is not marked, existing results are deleted, and the entire sweep or
corners analysis is run again.
10
Viewing and Analyzing Results
Various mechanisms are provided for viewing simulation and measurement results from
tests, sweeps, corners, and Monte Carlo analyses. Results can be displayed in tables of data
or plotted as waveforms, families of curves, or 3D surfaces, depending on the data available
for a run and the selected waveform viewer. Operating point and statistical information can
also be displayed.
For tests:
■ Results tab of the VSdE main window (see “Using the Results Tab” on page 249)
displays all available results data
■ Plot drop-down menu on Results tab on the Run tab of the Test Setup window (see
“Run Tab—Results Tab” on page 155) plots waveforms using the Default waveform
viewer (see “Defaults Group Box” on page 242)
■ Plot actions group box on the Waveforms tab on the Run tab of the Test Setup window
(see “Run Tab—Waveforms Tab” on page 158 and “Plot Actions” on page 282) specifies
how to plot the selected waveforms
To display the Results tab, do the following in the VSdE main window:
➤ Click to select the Results tab.
The icon to the left of each item in a Result Files folder indicates the kind of results data
available as follows:
Icon Description
Test results
Sweep results
Corners results
Measure results
Where a results item has a single value, that value is displayed after an equals sign following
the result name. For swept parameters, the range of values is shown in square brackets
following the parameter name until a particular, single value can be identified as the results
tree is traversed.
Result types for the selected item are displayed in the leaf window on the right side of the
Results tab. Result types include:
For those users with a full MATLAB license and software, the following additional result types
might appear. The Allow MATLAB measures and plotting check box must be marked in
the Defaults group box on the Other tab of the Options window (see “Defaults Group Box”
on page 242).
Trace names from the data file are displayed in the leaf window.
The opened association appears as a folder in the Associations tree. The name of a created
association is based on the sweep and test names from the Results Files tree. The manner
of display of the association and its contents depends on the Auto-flatten data file nodes
on Results tab setting (see “Flattening and Unflattening Branches” on page 252).
Trace names from the association are displayed in the leaf window (see also “Displaying Leaf
Data” on page 253).
List in Rows Displays items in multiple columns using row-major order; for
example:
1 2 3
4 5 6
7 8 9
Plotting Traces
To open a new plot window and plot a trace from an open data file or association, do the
following:
1. In the leaf window, select a trace name.
2. Click .
The Path area lists all files that make up the selected association.
The Menu Entry column lists the items on the selected right-click pop-up menu. The * in the
Dft column denotes the default action for double-clicking the data item.
Note: Table results for tests are displayed on the Results tab on the Run tab of the Test
Setup window (see “Run Tab—Results Tab” on page 155).
Sweep parameters are listed in rows at the top of the window, in the Parameters table. Each
column indicates the set of parameter values used for that simulation. The measured result
values associated with each simulation are listed in the rows and columns on the bottom half
of the Results table window. A column of data is selected by clicking on the iteration number
heading at the top of the column. Multiple columns can be selected by click-dragging,
Shift-clicking, and/or Ctrl-clicking.
The buttons along the bottom of the Results table window are used as follows:
Button Description
Waveform Opens the Select Waveform Files form for the selected columns of results
(see “Viewing Waveforms” on page 278)
View Presents a drop-down menu of the following selections (see the Cadence
Analog Design Environment User Guide):
■ Mismatch Summary
■ Noise Summary
■ Noise Parameters
■ Stability Summary
■ Pole-zero Summary
■ Sensitivity
Calculator Opens the Calculator window
Note: Which Calculator window appears depends on the Default for
Calculator and Direct Plot selection in the Defaults group box on the
Other tab of the Options window (see “Defaults Group Box” on page 242).
DP -> Calc Imports the plot expression from Direct Plot into the Calculator
Direct Plot Opens the Direct Plot Form, the design schematic window, and either a
Waveform Window or a Graph Window for plotting simulation results,
depending on the Default for Calculator and Direct Plot selection in the
Defaults group box on the Other tab of the Options window (see “Defaults
Group Box” on page 242)
Note: For information about the Direct Plot Form, see Chapter 10,
“Plotting and Printing” in the Cadence Analog Design Environment
User Guide.
Plot Opens the Plot Results window for plotting the selected column of results
(see “Plotting Results” on page 283)
Op Point Opens the Operating Point window (see “Analyzing Operating Point
Information” on page 263)
Statistics Opens the Results Statistics window (see “Viewing Statistics” on
page 276)
Format Opens the Cell Formatting window (see “Formatting Table Cells” on
page 260) for formatting cells in the selected rows
Tip
Use click-drag to select multiple rows.
Close Closes the Results table window
Help Opens Online Help for this window
The File menu (in the top right corner of the Results table window) is used to perform the
following tasks:
The following right-click pop-up menu appears whenever the right mouse button is clicked in
a table cell, or in a row or column header:
Tip
Use Shift-click or Ctrl-click to select multiple cells, rows, columns.
Caution
Sorting actions cannot be undone.
The following set of numeric formats for values in the table cells is supported:
The value in the Precision field, which defines the number of significant digits in the number
displayed, applies to all formats except Hexadecimal and Convert Digital.
The Reset button is used to reset the display to its default format.
The radio buttons (and spin box) on the Misc tab are used to specify the number of values
that can be displayed in each table cell:
Note: Formatting information is retained such that each time the file is opened for viewing,
and each time the file is updated with new results, the formatting changes are remembered.
The Results type drop-down list is used to select the operating point data type (for example,
dcOp_info, tranFinalOp_info). Device output variable information from the transient
operating point is collected automatically. Device output variable information from the DC
operating point is collected as follows:
1. For Analysis type = DC, click to mark the Save DC operating point check box on the
Analyses tab.
2. Click Apply.
3. Click Run.
Note: See the Virtuoso Spectre Circuit Simulator User Guide and Reference.
The items in this window are used as follows to view and manipulate operating point data,
and to annotate a schematic with this information:
Item Description
Results type Drop-down list of operating point data types (for example,
dcOp_info, tranFinalOp_info)
Component Drop-down list of components from the operating point results files
Filter Group box containing check boxes and radio buttons for filtering
data in the operating point information table on the bottom half of
the Operating Point window (see “Filtering Operating Point Data” on
page 264)
Instance Annotation
Group box containing buttons and a highlight-color drop-down
selection list for instance annotation (see “Back-Annotating
Operating Point Information to Schematic Instances” on page 269)
Node Annotation Group box containing buttons and a drop-down selection list for
node annotation (see “Back-Annotating Node Information to a
Schematic” on page 271)
Set Up Params Button used to open the Operating Point Parameter and Expression
Setup form (see “Setting Up Component Parameters and Building
Expressions” on page 272)
File Menu for saving (as CSV), printing, exporting (as HTML), and
refreshing data (see “Using the File Menu” on page 274)
Parameter expression
When this check box is marked, an expression typed in the field is used to filter the table of
data. For example, the expression vbe<.2 filters the data such that only BJTs in the table
whose vbe parameter is less than 0.2 are shown. More complicated expressions, like
.6<vbe<.7&&vbe*ic>9e-6, can also be constructed. Parameter expressions can contain:
■ Any number of operating point parameters
■ Parameter Expression Operators for Filtering on page 265
■ Basic Mathematical Functions for Filtering on page 266
The following operators are supported in parameter expressions for operating point data
filtering:
The following basic mathematical functions are supported in parameter expressions for
operating point data filtering:
Component filter
When this check box is marked, the component instances listed in the table on the lower half
of the Operating Point window are filtered to display only those that meet the filtering criteria.
Table entries are filtered either by selecting specific components on the schematic (the ones
to be displayed in the table) or by specifying a filtering pattern to match, depending on the
radio button selected as follows:
■ The By selection radio button enables filtering by component selection. Only
components selected from the schematic are displayed in the table on the lower half of
the Operating Point window. Component selection is accomplished using either the
Select or the Selection List buttons as follows:
❑ The Select button is used to bring the schematic window to the foreground and
initiate schematic selection. Selected items appear in the operating point filter list
(see “Specifying a Filter List” on page 268). When a component is selected a
second time, it is removed from the list.
❑ The Selection List button is used to open the Operating Point Filter List (see
“Specifying a Filter List” on page 268), which contains a list of selected components
for operating point data filtering.
■ The By pattern radio button is used to enable filtering by pattern matching. The pattern
typed in the field determines which instances are displayed. For example, the string
*.I3.* specifies all components with I3 as part of their hierarchical string specifiers. Only
components whose hierarchical specifiers match the filter pattern are displayed in the
table on the lower half of the Operating Point window.
Note: Those instances displayed in the table are the ones that will be annotated on the
schematic when the Annotate action is performed (see “Back-Annotating Operating Point
Information to Schematic Instances” on page 269).
The Selection List button in the Operating Point window is used to open the Operating Point
Filter List. This window displays the list of components, selected from a schematic, for which
operating point information will be displayed (on the lower half of the Operating Point window).
While reviewing the list, additional components can be selected (or removed from the list) by
clicking Select to bring the schematic window into the foreground in selection mode.
Selected components are added to the scrolling list area using the following format:
"hierarchical_pathname_of_component"
Multiple components are separated by a space. For example:
"/I3/I11/Q0" "/I3/I0/Q0"
Note: Both the Component filter check box and the Select radio button in the Filter group
box on the Operating Point window must be marked to enable the filtering specified in this
window.
Item Description
Annotate Button used to annotate the operating point parameter information set
up on the Operating Point Parameter and Expression Setup form (see
“Setting Up Component Parameters and Building Expressions” on
page 272), for the filtered set of components (as shown in the table on
the lower half of the Operating Point window), on the schematic
Annotate All Button used to annotate the operating point parameter information set
up on the Operating Point Parameter and Expression Setup form (see
“Setting Up Component Parameters and Building Expressions” on
page 272), for all components in the operating point results files, on
the schematic
Clear Annotate Button used to clear all operating point instance annotations from the
schematic
Highlight Drop-down list of colors for highlighting
Highlight Button used to highlight the components on the schematic for which
the operating point information is displayed on the lower half of this
window
Clear Highlight Button used to clear all component highlights from the schematic
Display Options Button used to open the Annotation Display Options form for specifying
instance annotation display options (see “Specifying Instance
Annotation Display Options” on page 270)
2. Select one of the following options for displaying instance annotations on the schematic:
Option Description
All params displayed at first annotation label
Selected operating point parameters are displayed on the
schematic using the first annotation label (cdsParam), with each
parameter displayed on a separate line
One param displayed per annotation label
Selected operating point parameters are displayed on the
schematic using one annotation label (cdsParam) for each
parameter
Note: If there are more parameters than available annotation
labels, then only as many parameters as annotation labels are
displayed.
3. Click OK.
Item Description
Node voltage Back-annotate node voltage information as
netName=voltageValue
Node current Back-annotate node current information as
netName=currentValue
Net name Back-annotate net name information
Pin name Back-annotate pin name information
None Clear all operating point node annotations from the schematic
2. Click Annotate.
Note: Node voltage annotation operations require that the symbols on the schematic have
cdsTerm labels.
Item Description
Component Drop-down list of components in the design
Available Scrolling list of component operating point parameters for the selected
component
Note: The environment provides an initialization file containing the
most commonly used operating point parameters for each component
type.
Annotate Scrolling list of component operating point parameters whose values
will be displayed in the table on the lower half of the Operating Point
window, and annotated on the schematic when either Annotate or
Annotate All is clicked (see “Back-Annotating Operating Point
Information to Schematic Instances” on page 269)
Add --> Button used to move a component operating point parameter or
expression from the Available list to the Annotate list
<-- Remove Button used to move a component operating point parameter or
expression from the Annotate list back to the Available list
Operating point expression
Group box in which an expression can be defined (see the table
below)
The following items appear in the Operating point expression group box on the Operating
Point Parameter and Expression Setup form:
Item Description
Name Field in which to type the name of the new expression (for example,
vdsdiff)
Expression Field in which to type the new expression, which can consist of any number
of operating point parameters and supports expression syntax as
described in “Parameter expression” on page 265 (for example,
vds-vdsat)
Note: The expression is evaluated and its value is displayed in a column of
the table on the lower half of the Operating Point window whose heading is
the name of the expression as typed in the Name field (above)—for
example, vdsdiff as the column heading, and the value of vds-vdsat
(evaluated) in a row cell of that column for every MOSFET instance
displayed in the table.
Clear Button used to clear the fields in the Operating point expression group
box
Note: This action does not delete the expression.
Add Button used to add the expression to the Annotate list
Delete Button used to delete the expression
Note: When an expression is deleted, it is also removed from the
Available/Annotate list.
Note: When an expression is selected in the Available or Annotate list, its name and
expression string are displayed in the Name and Expression fields in the Operating point
expression group box for modification or deletion.
Button Description
Save Saves changes made on the Operating Point Parameter and Expression
Setup form with the test so that these changes are available the next time
the test is opened
Note: Clicking OK saves the changes for the current Test Setup session
only. Once the Test Setup window is closed, the changes are discarded, and
the default settings are displayed the next time the test is opened.
Reset Reloads the default set of parameters to back-annotate for each device
type from $ACV_ROOT/admin/simulatorNameOpPtSetup.ini,
where simulatorName is either the name of the simulator (for example,
spectre), or default
To save the data presented in the Operating Point window as a file of comma-separated values, do the following:
1. Choose File – Save As to open the Save As form.
2. Specify a name and location.
3. Click Save.
To send the data presented in the Operating Point window to the printer, do the following:
1. Choose File – Print to open the Print Setup form.
2. Specify Print Setup information (see also “Printer Setup” in the Virtuoso
Specification-driven Environment Reference).
3. Click OK.
To export the data presented in the Operating Point window in HTML format, do the following:
➤ Choose File – Export to HTML.
For sweeps, to export this data for all sweep values to a single HTML file, do the following:
➤ Choose File – Export All Sweeps to HTML.
To export operating point data for all components (not just those presented in the Operating
Point window) in HTML format, do the following:
➤ Choose File – Export All Components to HTML.
For sweeps, to export all operating point data for all sweeps of all components to a single
HTML file, do the following:
➤ Choose File – Export All Sweeps of All Comps to HTML.
To enable automatic refresh of the data presented in the Operating Point window when the
results file is updated after rerunning a test/sweep/corners analysis, do the following:
➤ On the File menu, click to mark Auto Refresh.
To force a refresh of the data presented in the Operating Point window from the results file,
do the following:
➤ Choose File – Refresh.
Viewing Statistics
The Results Statistics window displays a table of the following statistics for sweep results (for
example, Monte Carlo, corners):
Statistic Description
Minimum Displays the minimum value of each result
Maximum Displays the maximum value of each result
Average Displays the average (sample mean, X̄) value of each result, calculated by summing all the values and dividing by the number of values
Median Displays the sample median value of each result, which is the middle value
for an odd number of values or, for an even number of values, is calculated by
summing the two middle values and dividing by two
Std. Dev. Displays the sample standard deviation of each result, which is calculated as follows:

S = \sqrt{\dfrac{n \sum_{i=1}^{n} X_i^{2} - \left( \sum_{i=1}^{n} X_i \right)^{2}}{n(n-1)}}
Variance Displays the sample variance of each result, which is calculated as follows:

S^{2} = \dfrac{\sum_{i=1}^{n} \left( X_i - \bar{X} \right)^{2}}{n-1}
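These statistics can be sketched in Python; the following is an illustrative reimplementation of the formulas above, not the tool's own code:

```python
import math

def result_statistics(values):
    """Compute the statistics shown in the Results Statistics window
    (illustrative sketch, not the tool's actual implementation)."""
    n = len(values)
    mean = sum(values) / n                      # sample mean, X-bar
    ordered = sorted(values)
    if n % 2:                                   # odd count: middle value
        median = ordered[n // 2]
    else:                                       # even count: mean of the two middle values
        median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2
    # Sample variance: sum of squared deviations from the mean, divided by n - 1
    variance = sum((x - mean) ** 2 for x in values) / (n - 1)
    # Sample standard deviation via the computational formula above
    std_dev = math.sqrt((n * sum(x * x for x in values) - sum(values) ** 2)
                        / (n * (n - 1)))
    return {"min": min(values), "max": max(values), "mean": mean,
            "median": median, "std_dev": std_dev, "variance": variance}
```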
The Format button is used to open the Cell Formatting window (see “Formatting Table Cells”
on page 260) for formatting cells in the selected rows.
Viewing Waveforms
The Select Waveform Files form is used to specify the files and data to load for viewing in the
selected waveform viewer (Plot viewer). One or more waveform files can be selected for
display. The Auto-plot check box (upper right corner of the form) is used to enable/disable
automatic plotting of those signals for which the Plot check box is marked upon successful
completion of the simulation.
Note: Waveforms from different analysis types are plotted in separate windows. All measure
results are plotted in a single window.
The Select Waveform Files form is opened when one of the following actions is performed:
■ Click Waveform on the Job Status tab of the Sweep or Corners window.
■ Click Waveform in the Results table window after selecting one or more columns to
make the Waveform button active.
Item Description
Filter analysis Drop-down menu selection of the analysis type (for example,
Transient, AC, SP, XF) whose results files will be listed in the Select
waveform files to view area for selection
Filter pattern Field in which a file filtering pattern can be typed to narrow the list of
files; for example, results_0.1*
Select waveform files to view List of results files for the selected analysis type (Filter analysis); one or more files can be selected for results display
Tip
Use Shift-click or Ctrl-click to select more than one file; use
the Select All or Select None button to select all or none of
the files, respectively.
Note: For Filter analysis types AC and XF, the right-click pop-up
menu in this area provides a list of trace modifiers (see “AC, XF, and
SP Modifiers for AWD” on page 159). For Filter analysis type SP, the
right-click pop-up menu provides a selection of plot types (see
“S-Parameter Plot Types” on page 160).
Select All Button for selecting all files in the Select waveform files to view
scrolling list area
Select None Button for selecting none of the files in the Select waveform files to
view scrolling list area
Signal Specification
For most analysis types, the table at the top of the form is labelled Signals to plot. For XF
analysis (Spectre), Signals to plot becomes Instances to plot. For SP analysis (Spectre),
Signals to plot becomes S-Parameters to plot.
The table at the top of the form has the following columns:
Column Description
Plot Check box for selecting an item to plot; when the Auto-plot check
box is also marked, those items for which the Plot check box is
marked are automatically plotted upon successful completion of the
simulation
Name ■ For most analysis types, Name is a net or terminal specifier (for
example, /vcoOut)
■ For XF analysis (Spectre), Name is a valid XF instance name
(for example, /I8, /V6, /Rload)
■ For SP analysis (Spectre), Name entries are expressed using
S-parameter data matrix syntax (for example, S11, S12, S21,
S22)
Type ■ For most analysis types, Type is either net or terminal
■ For XF analysis (Spectre), Type is inst
■ For SP analysis (Spectre), Type is sp
Color Drop-down menu of available colors for plotting the item
The following buttons are available for populating the signal/instance/S-parameter table:
Button Description
Copy Save List Copies the list of items from the Nodes/currents save list on the
Sim Options tab of the Test Setup window (see “Sim Options
Tab—Spectre Tests” on page 198)
Select Opens the schematic window in the foreground for net or terminal
selection; one or more nets or terminals can be selected
Add Adds another row to the Signals to plot table
Delete Removes the selected row from the Signals to plot table
Plot Actions
The Plot actions group box features the following drop-down menus:
Menu Action
Plot viewer Selects one of the following for waveform viewing:
■ AWD (see the Analog Waveform User Guide)
■ SimVision (see the SimVision User Guide)
■ WaveScan (see the WaveScan User Guide)
Note: The <Use Default> selection causes the specified Default
waveform viewer to be used (see “Defaults Group Box” on page 242).
Plot window Selects one of the following plotting choices:
■ Clear, then plot—Clears any traces currently plotted in the
current window, then plots the selected items. This is the default
setting.
■ Plot in new window—Opens a new window and plots the
selected items. This window becomes the current window.
■ Overlay in current plot—Plots the selected items as an overlay
in the current plot, leaving whatever traces might already be there
intact.
The Plot button is used to display the named signals according to the specified plot actions.
Note: The Plot button causes the named nets to highlight on the schematic, using the
waveform display colors. Because the Virtuoso Schematic Editor is not in “Select” mode,
pressing esc does not return the nets to their unselected colors. To return the nets on the
schematic to their unselected colors, do one of the following:
■ On the Select Waveform Files form, click Select (see “Signal Specification” on
page 281), then press esc in the schematic window.
■ In the schematic window, choose Design – Probe – Remove All.
Plotting Results
The appearance of the Plot Results window depends on the Defaults settings on the Other
tab of the Options window (see “Defaults Group Box” on page 242) as follows:
■ If the Allow MATLAB measures and plotting check box is marked, then the Plot
Results window displays MATLAB plotting options (see “Plotting 2D and 3D Surface
Results” on page 284).
■ If the Allow MATLAB measures and plotting check box is not marked, then the Plot
Results window displays plotting options for the selected Default waveform viewer
(see “Plotting 2D and Curve Family Results” on page 283).
Button Purpose
2D Plot Plots the selected plots
Curve Family Plots the curve family results
Options Opens the Plot Options Side Panel for 2D and Curve Family Results on
page 286
Close Closes the Plot Results window
Help Opens Online Help for the Plot Results window
Important
When the Allow MATLAB measures and plotting check box is marked in the
Defaults group box on the Other tab of the Options window (see “Defaults Group
Box” on page 242), the Plot Results window displays MATLAB plotting options,
regardless of the Default waveform viewer selection. See “Plotting 2D and 3D
Surface Results” on page 284.
Note: Available combinations of results for plotting are expanded by clicking on the + to the
left of an item in the tree. When a combination is fully expanded, a list of available plots and
curve families for that combination is displayed.
Button Purpose
2D Plot Plots the selected plots
3D Surface Plots the selected surfaces
Close Plots Closes all open plot windows
Options Opens the Plot Options Side Panel for MATLAB Users on page 287
Close Closes the Plot Results window
Help Opens Online Help for the Plot Results window
Note: Available combinations of results for plotting are expanded by clicking on the + to the
left of an item in the tree. When a combination is fully expanded, a list of available plots and
surfaces for that combination is displayed.
The following drop-down selections are available on the side panel of the Plot Results window
for plotting 2D and curve family results:
The check boxes on the side panel are used to enable/disable the following options:
Option Description
Show title Display the results file name above the plot
Show grid Display grid on plot
Display full file names Display the full path and file name of the results file in the Select desired plot(s) list area
Note: When the check box is not marked (which is the default), only the results file name is displayed, without the path.
Show results vs. results plots When marked, collapses the Select desired plot(s) tree and rebuilds it to include plot selections for measurement results versus other measurement results, in addition to the standard measurement results versus swept parameters
When the Allow MATLAB measures and plotting check box is marked in the Defaults
group box on the Other tab of the Options window (see “Defaults Group Box” on page 242),
the following settings are available on the side panel of the Plot Results window:
2D Plot Options
Item Description
Style Drop-down list box of the following 2D plot styles:
■ Normal
■ Area
■ Bars
■ 3D Bars
■ Stem
■ Scatter + Histogram (useful for Monte Carlo analysis)
Show grid Check box to enable/disable display of the grid in the plot window for the
next item plotted
For example, to display a scatter plot and histogram of Monte Carlo results, do the following:
1. Select the desired plot results.
2. In the Style field, select Scatter + Histogram from the drop-down menu.
3. Click 2D Plot.
3D Surface Options
Item Description
Style Drop-down list box of the following 3D plot styles:
■ Normal shaded
■ Shaded contour
■ Normal meshed
■ Contour meshed
■ Floor meshed
■ Waterfall
Show grid Check box to enable/disable display of the grid in the plot window for the
next item plotted
Subplot Options
Item Description
Enable subplot Check box to enable/disable display of multiple plots in a single plot
window, up to Rows*Columns
Rows and Columns Number of rows and columns of plots that can be displayed in the plot window, up to a maximum of 10
Undo Last Removes the traces from the last plot in the plot window and frees that position in the subplot grid for a new subplot
Note: A wide range of functionality is provided in the plot visualization tool that is shipped
with the environment. See MATLAB user documentation for more information.
11
Verifying and Comparing Designs
The specification (spec) sheet tool provides a mechanism for managing and organizing
simulation results for different purposes:
■ Create a spec sheet for design reviews
■ Generate a summary report of pass/fail results for measurements
■ Display results for quick inspection of multiple simulation runs
■ Verify a design against specifications
■ Compare simulations of two implementations of a design (for example, behavioral vs.
transistor-level)
Each of these alternatives opens the New Specification form. The Results vs. spec radio
button is used to specify a spec sheet for viewing results against specifications; the
Comparison of two designs radio button is used to specify a spec sheet that compares two
sets of results.
Button Description
Add See “Adding and Removing Specifications” on page 301
Remove
Compare See “Comparing Results” on page 311
Expand See “Loading Result Data Automatically” on page 305
Properties See “Changing Results Directories” on page 312
Options Opens the Spec Sheet Options form for enabling the Allow results keys
option, which causes the “result keys” (parameter names) to be displayed
along with the measure names on the drop-down list in the Measure
column
OK Save changes and close the window
Cancel Close the window without saving changes
Apply Apply changes without closing the window
Help Opens Online Help for this window
Icons Description
Moves selected row up or down
Opens the Test Setup window for the selected row
Displays detailed pass/fail information for the selected row (see “Investigating
Pass/Fail Information” on page 313)
Exports entire spec sheet to an HTML file in the docs directory for the
project, named according to the spec sheet name; for example, a spec sheet
called my_spec results in an HTML file called my_spec.html
Opens the Print Setup form (see also “Printer Setup” in the Virtuoso
Specification-driven Environment Reference)
Opens the Save As form for selecting a name and location for the file of
comma-separated values to which to save the data
Column Description
Sweep Lists the sweeps and corners used to generate the results files found in the
specified results directory or directories
Note: For Comparison of two designs, only sweeps and corners from
results files that exist by exactly the same name in both results directories
are available for comparison. Both the file names and the sweeps/corners
names must match exactly.
Test Lists the tests used to generate the results files found in the specified
results directory or directories; specific to the specified sweep (in the
Sweep column), if one is selected
Note: For Comparison of two designs, only tests from results files that
exist by exactly the same name in both results directories are available for
comparison. Both the file names and the test names must match exactly.
Measure Lists the measures from the specific test or sweep-test results file
Note: For Comparison of two designs, all measures in both files must
match exactly to be available for comparison. The set of measures and their
names must match exactly.
Conditions Lists the values of each swept parameter as a range of values; for
example, if a parameter p is swept through a list of values x, y, and z, then
the Conditions column reflects the values as a range: p=x-z
Note: The conditions are set up on the Sweep tab of the Sweep window in
the Sweep parameters table (see “Sweeps” on page 201) or on the
Corners tab of the Corners window in the Corners parameters table (see
“Corners Analysis” on page 209).
Pass/Fail Displays a Pass or Fail status for each measure: Pass means that the
measured value for all runs in the sweep fall between the Min Spec and
Max Spec values, inclusive. Fail means that one or more of the measured
values falls outside the bounds of the Min Spec and Max Spec values.
To investigate the measured values further, see “Investigating Pass/Fail
Information” on page 313.
The Spec window is used to manage and organize simulation results so that they can be
compared against a set of specifications. The name of the spec sheet appears in the title bar.
The Min Spec, Max Spec, Min Value, and Max Value columns are specific to Results vs.
spec spec sheets:
Column Description
Min Spec, Max Spec Cells for specifying the minimum and maximum acceptable values for the measured value, such that measured values outside this range result in a pass/fail status of Fail:
Min Spec ≤ measured_value ≤ Max Spec = Pass
Note: If Min Spec is not specified, then measured values less than or equal
to Max Spec are considered passing.
Min Value, Max Value Minimum and maximum measured values for all simulations for that sweep/test/measure
Note: For a single simulation (not a sweep), the resulting value is displayed
in both the Min Value and Max Value cells. See also “Investigating Pass/Fail
Information” on page 313.
Note: See also “Using the Cell Formatting Window” on page 308.
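The pass/fail rule above can be sketched as follows (a hypothetical helper, not the tool's implementation): a row passes only when every measured value lies within the inclusive Min Spec/Max Spec bounds, and an unspecified Min Spec leaves the lower bound unchecked:

```python
def spec_status(measured, min_spec=None, max_spec=None):
    """Return "Pass" when every measured value satisfies
    min_spec <= value <= max_spec (inclusive); an unspecified
    bound is not checked, matching the note about Min Spec."""
    for value in measured:
        if min_spec is not None and value < min_spec:
            return "Fail"
        if max_spec is not None and value > max_spec:
            return "Fail"
    return "Pass"

def value_range(measured):
    """Min Value and Max Value as shown on the spec sheet."""
    return min(measured), max(measured)
```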
The Comparison Spec window is used to manage and organize simulation results of one
design so that they can be compared against another design—for example, to verify that the
simulated results of a behavioral model match those of the original circuit design; to verify one
circuit against another, or one behavioral model against another. The name of the spec sheet
appears in the title bar. The Ver Tol, Tol, Min Comp, and Max Comp columns are specific
to Comparison of two designs spec sheets (see the table, below).
Column Description
Min Comp, Max Comp Minimum and maximum calculated differences between the measured values for the two designs for the indicated sweep/test/measure
Note: See also “Investigating Pass/Fail Information” on page 313.
Tol Tolerance (see Ver Tol description, next)
Column Description
Ver Tol Verification tolerance, selected from a drop-down list, that specifies how the
results from the two designs are compared. Pass/fail criteria are as follows:
■ Absolute—the absolute value of the difference in measured values for
the two designs must be less than the specified tolerance:
Pass = |Design1value – Design2value| < Tol
Note: See also “Using the Cell Formatting Window” on page 308.
The absolute tolerance (the value in the Tol cell) is 2n. Global parameter P1 is swept through
some set of values (for example, 1, 2, 3). The measurement function returns the values
shown in the Design1 Value and Design2 Value columns. The calculated difference
between these values is shown in the Difference column. A status of Pass is returned only
for cases where |Design1value – Design2value| < 2n:
The Min Comp for this set of values is 0. The Max Comp is 2n. Because at least one of the
detailed comparisons for the set failed, the pass/fail status is displayed as Fail on the spec
sheet. (See also “Investigating Pass/Fail Information” on page 313.)
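A sketch of the Absolute comparison described above (hypothetical helper names; the tool performs this calculation internally): each pair of values must differ by strictly less than the tolerance, and Min Comp/Max Comp are the smallest and largest differences:

```python
def compare_absolute(design1_values, design2_values, tol):
    """Comparison spec with Ver Tol = Absolute: a pair passes when
    |d1 - d2| < tol; the whole row fails if any pair fails."""
    diffs = [abs(a - b) for a, b in zip(design1_values, design2_values)]
    status = "Pass" if all(d < tol for d in diffs) else "Fail"
    return min(diffs), max(diffs), status  # Min Comp, Max Comp, Pass/Fail
```

Note the strict inequality: as in the worked example above, a difference exactly equal to the tolerance fails.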
The Remove button is used to remove the selected row. To remove a specification from the
table, do the following:
1. Click anywhere in the row to be removed.
2. Click Remove.
Item Description
Design results Field for specifying the design results directory
Tip
Click the browse icon to open the Select Results Directory form (see “Select Results Directory Form” on page 304).
Specify sweep and test name Check box to enable the Sweep and Test drop-down
selection fields
Sweep Drop-down selection list of available sweeps
Test Drop-down selection list of available tests
Item Description
First design results Fields for specifying two design results directories
Second design results
Tip
Click the browse icon to open the Select Results Directory form (see “Select Results Directory Form” on page 304).
Specify sweep and test name Check box to enable the Sweep and Test drop-down
selection fields for First design and Second design
Sweep Drop-down selection list of available sweeps
Test Drop-down selection list of available tests
(Optional) Select the sweep and test for each of the two designs as follows:
1. Click to mark the Specify sweep and test name check box to enable the Sweep and
Test drop-down selection fields.
2. Select a sweep name from the drop-down list in the Sweep field for each of the two
designs.
3. Select a test name from the drop-down list in the Test field for each of the two designs.
4. Click OK to save changes and close the form.
The Result tree on the top half of the form can be navigated and expanded to select a new
lib/cell/view/results location, which appears in the Result directory expanded information
area on the bottom half of the form. The Results files area lists all results files (.res) found
in the selected location.
Specified Expand
Sweep Adds a row for every combination of sweep, test, and measure from
every result file for this sweep in the specified results directory (see
“Adding and Removing Specifications” on page 301)
Sweep and Test Adds a row for each measure associated with this sweep and test
The columns of a spec sheet can be moved and auto-sized using the corresponding toolbar buttons. User Columns can be added, renamed, and deleted using the column right-click pop-up menu. Any column can be hidden or unhidden.
The rows of a spec sheet can be moved and auto-sized using the corresponding toolbar buttons. Rows of results data can be added and removed using the Add and Remove buttons. The Expand button can be used to expand the set of results displayed in the spec sheet automatically (see “Loading Result Data Automatically” on page 305). Detailed information about the data in the current row is available using either the row right-click pop-up menu or the corresponding toolbar buttons.
Note: Row and column auto-size actions operate on all rows or columns in the spec sheet
either to adjust the width (for columns) or height (for rows) to fit the entire contents of the cells.
The contents of a cell can be edited or formatted using the cell editing right-click pop-up
menu. Multiple cells can be formatted using the multiple-cell formatting right-click pop-up
menu.
Menu Description
Column Pop-Up: Right-click a column heading to see this menu.
■ Hide Column: Hides the current column
■ Unhide Column: Presents a submenu of hidden columns;
click to unhide
■ Insert Column: Inserts a User Column to the right of the
current column; type a custom name upon creation and
press Return
■ Rename Column: Highlights the User Column name for
editing; type a new name and press Return
■ Delete Column: Deletes the current User Column
Cell Editing Pop-Up: Right-click a highlighted text string in a cell to see this menu.
■ Undo: Undoes the last edit action
■ Cut: Deletes the selected text while retaining a copy in the
paste buffer
■ Copy: Copies the selected text to the paste buffer
■ Paste: Pastes the contents of the buffer at the cursor
location
■ Delete: Deletes the selected text
■ Format Cells: Opens the Cell Formatting window for the
current cell (see “Using the Cell Formatting Window” on
page 308)
Row and Multiple Cell Formatting Pop-Up: Right-click in a row or in a highlighted set of cells to see this menu. (Click-drag to highlight a set of cells.) Toolbar shortcuts are available for Plot, Show Details, and Show Test.
■ Format Cells: Opens the Cell Formatting window for the selected cells (see “Using the Cell Formatting Window” on page 308)
■ Plot: Opens the Plot Results window for the Measure results in the selected row (see “Plotting Results” on page 283)
■ Show Details: Displays detailed pass/fail information for the selected row (see “Investigating Pass/Fail Information” on page 313)
■ Show Test: Opens the Test Setup window for the selected row
■ Expand: See “Loading Result Data Automatically” on page 305
Tip
Use click-drag to highlight multiple cells for formatting.
In either case, the Cell Formatting window is presented for formatting the selected cells. The
tabs of this window are used to perform the following tasks:
■ Format the display of the data in the cells—see “Format Tab” on page 308
■ Format the alignment of the data in the cells—see “Alignment Tab” on page 310
■ Specify text, background, and foreground colors, and background pattern for the cells—
see “Color Tab” on page 310
■ Format the borders for the cells—see “Border Tab” on page 311
Format Tab
The Format tab is used to specify a display format for the data in a cell or cells. The following
format styles are available:
The various fields available depend on the Format style selected and can include the following:
Field Description
Units Text string appended to the display of the number (for example, V for “Volts”,
H for “Henries”, Hz for “Hertz”, etc.)
Precision Number of significant digits displayed
Suffix Drop-down list of engineering suffixes to be used when displaying the
number: {T (1e12), G (1e9), M (1e6), K (1e3), m (1e-3), u (1e-6), n (1e-9),
p (1e-12), f (1e-15), a (1e-18)}
The Preview pane provides a sample of the selected format style using the contents of the
current cell.
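The Suffix scaling described above can be sketched as follows; this is an illustrative formatter, not the tool's internal one (note the manual's uppercase K for 1e3):

```python
# Engineering-suffix scales from the Suffix drop-down list above.
SUFFIXES = [(1e12, "T"), (1e9, "G"), (1e6, "M"), (1e3, "K"),
            (1.0, ""), (1e-3, "m"), (1e-6, "u"), (1e-9, "n"),
            (1e-12, "p"), (1e-15, "f"), (1e-18, "a")]

def format_eng(value, precision=3, units=""):
    """Render a number with an engineering suffix, significant-digit
    precision, and an optional units string (e.g. V, H, Hz)."""
    if value == 0:
        return f"0{units}"
    mag = abs(value)
    for scale, suffix in SUFFIXES:
        if mag >= scale:
            return f"{value / scale:.{precision}g}{suffix}{units}"
    # Smaller than 1e-18: fall back to the smallest suffix
    return f"{value / 1e-18:.{precision}g}a{units}"
```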
Alignment Tab
The Alignment tab is used to specify the horizontal and vertical alignment of the data in the
cell. The Preview pane provides a sample of the selected alignment using the contents of
the current cell. Additionally, the check boxes provide selections for wrapping text and
displaying multiple lines in the cell.
Color Tab
The Color tab is used to select colors (from drop-down lists) for the following items:
■ Text: data displayed in the cell
■ Foreground: main color of the cell
■ Background: background color of the cell
■ Pattern: background pattern of the cell
The Preview pane provides a sample of the selected colors using the contents of the current
cell.
Border Tab
The Border tab is used to format the border lines along the Left, Right, Top, and Bottom of
the cell. Line width is selected using the radio buttons in the Type group box on the tab; line
color is selected from a drop-down list of colors in the Color field. The Preview pane provides
a sample of the selected bordering using the contents of the current cell.
Comparing Results
The Compare button is used to compare the result values either as a design against a
specification (using Min Spec and Max Spec; see “Results vs. Spec Columns” on page 298)
or as two designs (using Ver Tol and Tol; see “Comparison Spec Columns” on page 299).
All result values for the indicated measure must fall within the passing range in order for a
pass/fail status of Pass to be reported. If one or more of the comparison values falls outside
the passing range, then the pass/fail status is reported as Fail.
For pass/fail criteria specific to the different spec sheet types, see:
■ Results vs. Spec Columns on page 298
Tip
Click to open the Select Results Directory form for changing the results file
associated with the selected row (see “Select Results Directory Form” on
page 304).
The Pass/Fail column displays either Pass or Fail, depending on whether the comparison
criteria were met or not. If at least one of the simulations in the details window has a Fail
status, then the Pass/Fail column of the spec sheet displays Fail. Pass appears with a green
background; Fail appears with either a yellow or a red background, as follows:
■ Yellow: Fewer than 10% of the measured values failed to fall within the specified range
■ Red: 10% or more of the measured values failed to fall within the specified range
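The color rule above amounts to a single threshold check (a hypothetical helper, not the tool's code):

```python
def fail_color(num_failed, num_total):
    """Background color for a Fail status: yellow when fewer than 10%
    of the measured values failed, red at 10% or more."""
    return "yellow" if num_failed / num_total < 0.10 else "red"
```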
Button Description
Displays simulation results in a table (see “Viewing Table Results” on
page 255)
Note: For Comparison of two designs details, first opens the Select Result
File form for selecting which of the two result tables to display.
Opens the Select Waveform Files form for viewing the simulation waveforms
(see “Viewing Waveforms” on page 278)
Opens the error log file for a failed or passed sweep result in a read-only text
editor window
Button Description
Refreshes the results
Exports the displayed data to an HTML file in the docs directory for the
project, named according to the measure name; for example, the details page
for a measure called myMeasure results in an HTML file called
myMeasure.html
Opens the Print Setup form (see also “Printer Setup” in the Virtuoso
Specification-driven Environment Reference)
Opens the Save As form for selecting a name and location for the file of
comma-separated values to which to save the data
12
Calibrating Behavioral Models
The Model Calibration tool is used to generate a silicon-calibrated behavioral model for a
transistor-level design. A behavioral model is a high-level description that models the
behavior of a mixed-signal function or design. For example, a behavioral model of a system
design is often useful for simulating the system in a top-down design flow. A calibrated model
is a behavioral model that uses characterized results to define parameter values.
Characterized results are provided in one of the following ways:
■ Run simulations and sweep parameters of the transistor-level circuit to generate a
look-up table of calibrated values
■ Substitute a constant, calibrated value for a parameter, obtained as a result of simulation
or by some other means
For example, the original behavioral model might be expressed using an equation as follows:
vctrl=V(in);
freq=100M*vctrl+86M;
After successful simulation of the original design, the freq parameter is modified to
reference a table model for model calibration:
freq=$table_model(vctrl,"vco_frequency.vat", "");
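The table model referenced by $table_model maps the swept control value to measured results. The following is a minimal sketch of building and linearly interpolating such a look-up table (hypothetical Python helpers, using sample points from the example relationship freq=100M*vctrl+86M; see the Cadence Verilog-A Language Reference for the actual $table_model semantics):

```python
def build_table(sweep_points):
    """Sort (control, result) pairs, as a calibration sweep might produce them."""
    return sorted(sweep_points)

def lookup(table, x):
    """Piecewise-linear interpolation over the table, clamped at the ends."""
    if x <= table[0][0]:
        return table[0][1]
    if x >= table[-1][0]:
        return table[-1][1]
    for (x0, y0), (x1, y1) in zip(table, table[1:]):
        if x0 <= x <= x1:
            return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# vctrl -> frequency samples, as if measured from freq = 100M*vctrl + 86M
vco_table = build_table([(1.0, 186e6), (0.0, 86e6), (0.5, 136e6)])
```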
Tip
The Library List button is used to open the Library List Manager form for viewing and changing the list of available libraries (see “Library List Button” on page 184).
3. In the cell tree on the lower half of the form, navigate to and select the file containing the
behavioral model description.
Tip
Click + to the left of an item to expand the tree.
4. Click OK.
The Input tab of the Model Calibration window displays the name and location of the input
behavioral model. The Top module field contains the name of the module definition of the
behavioral model in the specified file.
Specifying Parameters
To specify parameters for model calibration, do the following on the Parameter Setup tab of
the Model Calibration window:
1. In the Core model parameters scrolling list, click to select a parameter.
2. In the Modeling method group box, select one of the available modeling methods (see
“Modeling Methods” on page 324).
3. On the lower half of the Parameter Setup tab, specify the result details (see “Result
Details” on page 326).
The behavioral model is calibrated by passing in either single result values or look-up tables
created from result values. The set of result details available for selection (on the lower half
of the tab) depends on which modeling method is specified.
Modeling Methods
Different modeling methods are available for different parameter types as follows:
Important
The Not used selection is used to indicate that no value will be passed to the
parameter during model calibration.
Method: 2D table, 3D table, 4D table

Description (1): Creates a look-up table whose name is used in a $table_model
function call (see “Interpolating with Table Models” in the Cadence Verilog-A
Language Reference) in the behavioral model.

Note: The name of the table model file (.vat) is based on the model
parameter name; the cell name is used as a prefix (for example, divider_…)
in the calibrated model file to make sure that the table file name is unique
for different cells that use the same parameter names.

A table file is built using the first data found. For example, if temperature,
voltage, and process are varied in the sweep, and the table is created using
temperature and voltage, then the values for the first process model value
found are used, unless conditions are used to specify the process model file
to be used. The following fields and drop-down lists are used for specifying
information when the selected modeling method is an nD table:
■ Results
■ Sweep
■ Test
■ Result
■ Y-var
■ X-var (for 3D and 4D tables only)
■ W-var (for 4D table only)
■ Result list vs. result list check box
The Advanced button is used to open the Advanced Settings for Tables form
for specifying results filtering conditions and extrapolation method for building
look-up tables (see “Specifying Conditions for Tables” on page 329).
Method: Constant

Description (1): Indicates that a single value is to be passed to the
behavioral model parameter. The value can be one of the following:
■ constant—for example, 1.2
■ project or workspace parameter—for example, $vdd_value
■ expression—for example, ($vdd_value/2)
Method: Result

Description (1): Indicates that a single value from a measured result is to be
passed to the behavioral model parameter. When multiple values are encountered
(for example, for different sweeps, or for a measure with multiple values),
only the first value is passed to the behavioral model parameter. The
following items are used for specifying information when the selected
modeling method is Result:
■ Results
■ Sweep
■ Test
The Optional condition field is used for specifying a conditional filter for the
selected result data (see “Condition” on page 330).
(1) For information on the fields and lists, see the appropriate section under
“Result Details” on page 326.
Result Details
The set of result details available for selection depends on the modeling
method (see “Modeling Methods” on page 324). The following procedures are
outlined:
■ Changing the Results Location on page 327
■ Selecting a Sweep on page 327
■ Selecting a Test on page 328
■ Selecting a Result on page 328
■ Selecting Parameters on page 328
■ Specifying Conditions for Tables on page 329
Changing the Results Location
To change the location of the results to be used for model calibration, do the
following:
1. At the right end of the Results location field, click the browse button.
The Select Results Directory form appears.
Selecting a Sweep
To select the sweep that contains the test from which the result is to be obtained, do the
following:
1. At the right end of the Sweep field, click the down arrow.
2. From the drop-down list, select a sweep name.
Tests corresponding to the selected sweep are available for selection from the drop-down list
in the Test field (see “Selecting a Test” on page 328).
Selecting a Test
To select the test to be used to obtain the measured result for model calibration, do the
following:
1. At the right end of the Test field, click the down arrow.
2. From the drop-down list, select a test name.
Results corresponding to the selected test are available for selection from the drop-down list
in the Result field (see “Selecting a Result” on page 328).
Selecting a Result
Selecting Parameters
To select a swept project or workspace parameter from one of the variable lists, do the
following:
1. At the right end of the ?_var field (for example, Y_var), click the down arrow.
2. From the drop-down list, select a parameter.
Note: One value of the selected Result (see “Selecting a Result” on page 328) is used for
multiple values of the selected parameter (for example, W_var) and other selected
parameters (for example, X_var, Y_var).
The number of parameters that can be selected depends on the modeling method as follows:
Important
When the Result list vs. result list check box is marked, the string (result) is
appended to the field label—for example, Y_var (result)—and the value in the field
specifies a measured result that is paired with the selected Result. The first "pair"
used in the look-up table is one set of sweep values for the selected parameter
versus one set of sweep values for the selected Result.
Specifying Conditions for Tables
To specify advanced settings for 2D, 3D, and 4D tables, do the following:
1. On the Parameter Setup tab, click Advanced.
The Advanced Settings for Tables form appears.
2. In the Condition field, specify a conditional filter for the selected result data (see
“Condition” on page 330).
3. In the Table name field, specify the name of the file to which the table will be written (see
“Table name” on page 330).
4. Click OK.
Condition
The Condition field is used for specifying a conditional filter for the selected result data. One
value is passed to the behavioral model parameter. If multiple values are found, then the
condition is used to select a subset from which the first value is chosen.
For example, the following conditions might apply when qualifying a gain result:
process==model.typ && temperature<100
gain>0 && gain<100 && load_cap<10f
vdd_value!=27
Table name
The Table name field is used for specifying the name of the file to which the table will be
written. The table name displayed in this field when the form is first presented (generated in
the environment) is the name that will be used if OK is clicked. This table file name, which
can be changed by clicking in the field and editing the text, is prefixed with the cell name (to
make sure it’s unique) and used in the $table_model function call (see “Interpolating with
Table Models” in the Cadence Verilog-A Language Reference) in the behavioral model.
Table file names must be unique to avoid collisions. Different table files can be created for
different conditions using the same behavioral model. Unique table file names are required
when data from different process models are varied in a sweep, and the data is passed into
the behavioral model from different look-up table files.
The Generate tab is used to perform the final phase of generating a calibrated behavioral
model. The lib/cell/view destination for the calibrated model is specified in the Library, Cell,
View, and File fields in the Calibrated model destination group box. The Library field
provides a drop-down list from which an available library can be selected.
The Use Spectre $table_model look-up function check box is used as follows:
■ When marked, the generated model is a Cadence-specific Verilog-AMS model, which
contains $table_model functions (see Interpolating with Table Models in the
Cadence Verilog-A Language Reference). This model choice is recommended.
■ When not marked, the generated model is a standard Verilog-AMS model that can be
simulated by any Verilog-AMS simulator (it contains no internally-generated look-up
tables). This standard model, which is slower and uses more memory, uses while loops
to perform the equivalent of a look-up operation.
The buttons of the Generate tab are described in the following table.
Button Description
Generate Generates the calibrated model and its required look-up tables
View Output Opens status messages resulting from the model calibration in a
read-only text editor window
View Model Opens the generated model in a read-only text editor window
View Tables Opens the View Table File form for selecting a table file for viewing in
a read-only text editor window
Export to Plan Exports the Perl used to generate the model to an existing plan
selected from the drop-down list in the Plan field on the Plan
Selection form:
The plan (Perl script), when executed, extracts result data from the
simulation results of the transistor-level model and generates the
calibrated behavioral model with its required look-up tables of data.
The following Perl extensions are used to extract result data for passing into the behavioral
model or for creating a look-up table:
■ acv_calibrate_model on page 334
■ acv_create_result_table on page 335
■ acv_get_res_value on page 337
■ acv_translate_to_standard_verilogams_model on page 338
Note: The created Perl script exported to a plan can be edited if unsupported Perl features
are required.
acv_calibrate_model
error = acv_calibrate_model("input_model", "module_name", "output_model",
"calibration_string*");
Description
Arguments
Value Returned
Example
acv_calibrate_model("$output_dir/$model_file.orig", "nd2",
"$output_lib/$output_cell/$output_view/$output_file", "a_incap=nd2_a_incap.vat",
"b_incap=nd2_b_incap.vat", "a_y_rise_t1=nd2_a_y_rise_t1.vat",
"a_y_rise_delay=nd2_a_y_rise_delay.vat", "a_y_rise_t2=nd2_a_y_rise_t2.vat",
"a_y_fall_t1=nd2_a_y_fall_t1.vat", "a_y_fall_delay=nd2_a_y_fall_delay.vat",
"a_y_fall_t2=nd2_a_y_fall_t2.vat", "b_y_rise_t1=nd2_b_y_rise_t1.vat",
"b_y_rise_delay=nd2_b_y_rise_delay.vat", "b_y_rise_t2=nd2_b_y_rise_t2.vat",
"b_y_fall_t1=nd2_b_y_fall_t1.vat", "b_y_fall_delay=nd2_b_y_fall_delay.vat",
"b_y_fall_t2=nd2_b_y_fall_t2.vat");
acv_create_result_table
void acv_create_result_table("result_file", "table_file", "result",
"variable*", "[condition*]", all, flat, $errmes);
Description
Creates a look-up table from a result file using specified variables and optional conditions.
Arguments
Value Returned
Example
acv_create_result_table("$result_dir/results/s2-t2.res",
"$wrap_dir/a_y_r_del.vat", "delay_ar", "temperature vdd_value",
"process==model.min", "0", "1",$errmes);
acv_get_res_value
char *acv_get_res_value("result_file", "result", "[condition*]", "key_num",
"res_num", $errmes);
Description
Arguments
Value Returned
Example
acv_get_res_value("$result_dir/results/s2-t2.res", "delay_af",
"load_cap>10f",1,1,$errmes);
acv_translate_to_standard_verilogams_model
void acv_translate_to_standard_verilogams_model( "input_file", "output_file");
Description
Arguments
Value Returned
Example
acv_translate_to_standard_verilogams_model( "vco8.va.orig", "vco8.va");
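Taken together, an exported calibration plan chains these extensions: build the look-up tables, calibrate the model, and optionally translate it to standard Verilog-AMS. The following is a hedged sketch assembled from the per-function examples above; the file names, result paths, and parameter bindings are illustrative only:

```perl
# Sketch of a calibration plan body (illustrative names and paths).
# Build a 2D look-up table of a delay result versus temperature and
# vdd_value, filtered to the minimum process model.
acv_create_result_table("$result_dir/results/s2-t2.res",
    "$wrap_dir/a_y_r_del.vat", "delay_ar", "temperature vdd_value",
    "process==model.min", "0", "1", $errmes);

# Calibrate the behavioral model by binding a parameter to the table.
acv_calibrate_model("$output_dir/$model_file.orig", "nd2",
    "$output_lib/$output_cell/$output_view/$output_file",
    "a_y_rise_delay=nd2_a_y_rise_delay.vat");

# Optionally translate the result to a standard Verilog-AMS model that
# does not rely on the $table_model look-up function.
acv_translate_to_standard_verilogams_model(
    "$output_dir/vco8.va.orig", "$output_dir/vco8.va");
```

The acv_* extensions are only available inside the SdE plan environment, so this fragment is meant to be pasted into a plan body rather than run standalone.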
13
Developing and Editing a Plan
Overview
A plan is a sequence of steps required to complete a particular task. For
example, characterizing a design might involve running several tests or
sweeps. These steps can be grouped together to form a characterization plan,
which is “recorded” as a Perl script.
The Virtuoso® Specification-driven Environment (SdE) provides extensions to Perl to support
the following tasks (see “Perl Extensions” in the Virtuoso Specification-driven
Environment Reference):
■ Specifying sweeps and corners
■ Running tests, sweeps and corners
■ Accessing result data
■ Creating look-up tables from result data
■ Creating calibrated models
■ Adding results to a results database
Important
Plan creation is for advanced users who are comfortable programming in Perl.
Information specified in the environment windows can be exported into plans, and
familiarity with programming in Perl is useful in understanding and modifying the
created plan. A central IP group can create plans that are shared by several
designers.
Creating a Plan
To create a new plan, do one of the following:
1. In the VSdE main window, choose Add – New Plan.
The New Plan form appears.
2. In the Plan name field, type a name for the new plan.
3. Click OK.
A plan template appears in a text editor window (see “Text Editor” in the Virtuoso
Specification-driven Environment Reference). The plan template sets the project
directory, opens the project, and loads all project and workspace parameters.
Exporting to a Plan
Project files (for example, tests, sweeps, model calibrations, spec sheets) can be exported
into a plan. For example, to add a test called mytest to be run from a plan, do the following:
1. Return to the Files tab and right-click on the test mytest.
2. Select Export to Plan from the right-click menu.
3. On the Plan Selection form, select the plan from the current set of created plans.
4. Click OK.
The following lines are added to the plan to run the newly exported test:
$status=acv_run_test("Name_of_Test");
printlog "Error: Test Name_of_Test failed" if $status;
Information from one or more tests, sweeps, corners analyses, spec sheets, and model
calibrations can be added to a plan.
Note: The contents of tests and spec sheets are not copied to the plan. Instead, a reference
is created to run the test or spec sheet using the acv_run_test or acv_run_spec_sheet
Perl command (see “Perl Extensions” in the Virtuoso Specification-driven Environment
Reference). Sweeps, corners analyses, and model calibrations are copied to plans and can
be modified within those plans. An example plan is provided in
$ACV_ROOT/example_projects/digital_example/plans (see
example_plan.pl).
Once everything has been added/exported to a plan, the plan can be modified in any text
editor. Perl extensions (see “Perl Extensions” in the Virtuoso Specification-driven
Environment Reference) can be used, as well as any commands available in Perl.
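Because a plan is ordinary Perl, the documented extensions can be wrapped in standard Perl control flow. A hedged sketch (the test names are illustrative, not from a real project):

```perl
# Run several exported tests in a loop using plain Perl control flow
# around the documented acv_run_test extension.
foreach my $test ("vco8Top", "mytest") {
    my $status = acv_run_test($test);
    printlog "Error: Test $test failed" if $status;
}
```

This pattern keeps the plan short when many tests share the same error handling.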
Important
Once the various project items (tests, sweeps) have been exported to a plan, the
plan contains a copy of each of those project items. If the plan is changed, those
changes are not reflected back to the item that was copied. Also, any changes made
to the item in the environment are not reflected in the plan, unless the item is
removed from the plan and re-exported.
Linking to a Plan
Tests, sweeps, and corners analyses can be called by reference from within a plan. Link to
Plan, on the right-click pop-up menu for a project file, is used to create a reference to the
selected project item (test, sweep, or corners analysis). The contents of these project items
are not copied to the plan. Instead, a reference is created to run the item using a Perl
command specific to the project item (see “Perl Extensions” in the Virtuoso
Specification-driven Environment Reference).
For example, to create a reference to a sweep called mysweep to be run from a plan, do the
following:
1. Return to the Files tab and right-click on the sweep mysweep.
2. Select Link to Plan from the right-click menu.
3. On the Plan Selection form, select the plan from the current set of created plans.
4. Click OK.
The following lines are added to the plan to run the specified sweep:
$status=acv_run_sweep("Name_of_Sweep");
printlog "Error: Sweep Name_of_Sweep failed" if $status;
Note: Information from one or more tests, sweeps, corners analyses, spec sheets, and
model calibrations can be added to a plan by choosing Export to Plan from the right-click
pop-up menu (see “Exporting to a Plan” on page 341).
use ACV;
open_project("projectDir", "projectFile");
load_project_params();
load_workspace_params();
@plan_params = get_param_list();
sub run {
    planBody
}
run_plan();
where planBody can consist of any of the Perl extensions supported in the environment
(see “Perl Extensions” in the Virtuoso Specification-driven Environment Reference),
as well as any valid Perl commands and constructs. The Perl commands provided
to support hierarchical plans are described in the following sections.
Opening a Project
To open a project from within a plan, use the open_project command, passing
the project directory and the project file. For example:
open_project("/home/user/Projects", "myProject.apf");
The opened project becomes the current project for loading parameters and other
operations.
Loading Parameters
To import project parameters from the current project and workspace parameters from the
current workspace value set, use the load_project_params and
load_workspace_params commands as follows:
load_project_params();
load_workspace_params();
The current project and workspace parameter set are specified when the plan is run (see
“Running Plans” on page 345). The imported parameters and their values are used during
plan execution. The scope of loaded parameters is limited to the current package.
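A minimal sketch of this usage follows. It assumes that loaded parameters become Perl scalar variables (as suggested by the $-prefixed parameter names, such as $vdd_value, used elsewhere in this manual), and the parameter name is illustrative:

```perl
# Minimal sketch: load parameters, then reference them as Perl scalars.
# Assumes a project parameter named $vdd_value exists in the project.
load_project_params();
load_workspace_params();
printlog "Using supply value $vdd_value";
```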
Updating Parameters
To update the project and workspace parameters with the current values of these parameters
during plan execution, use the update_project_params and
update_workspace_params commands as follows:
update_project_params();
update_workspace_params();
Running Plans
To execute a named plan, optionally with specific parameter values, use the run_plan
command as follows:
run_plan(["planName", "planDir"[, paramList]]);
For example:
run_plan("ota_verify.pl", "/home/user/plans/ota", '$vdd', 3.3, '$vss', 0);
The run_plan command returns a handle to the named plan which can be used with the
get_plan_param and get_param_list commands (see “Using Parameters” on
page 346). For example:
$handle = run_plan("ota_verify.pl", "/home/user/plans/ota", '$vdd', 3.3,
'$vss', 0);
Using Parameters
To get the value of a parameter from another plan for use in the current plan, use the
get_plan_param command in conjunction with the run_plan command as follows:
$handle = run_plan("mySynth.pl", "analyticFilter");
$p1 = get_plan_param(handle, "parameter");
The run_plan command returns a handle to a named plan which is, in turn, used by the
get_plan_param command to specify the plan from which the parameter is to be imported.
The get_plan_param command returns the value of a single, named project or workspace
parameter from the specified plan execution.
handle Perl $ variable that holds the value returned by the run_plan
command
parameter Name of a parameter whose value is to be imported for use by
the current plan
Note: For information on the run_plan command, see “Running Plans” on page 345.
For example:
$handle = run_plan("mySynth.pl", "analyticFilter");
$p1 = get_plan_param($handle, '$p1');
To get the complete list of project and workspace parameter names from the specified plan
execution, use the get_param_list command as follows:
get_param_list([handle]);
For example:
$ref = run_plan("myPlan.pl", "vco");
@vco_params = get_param_list($ref);
When used without the handle argument, the get_param_list command returns the
complete list of project and workspace parameter names from the current plan:
@param_list = get_param_list();
Note: For information about loading parameters, see “Loading Parameters” on page 344.
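Combining the two commands, a parent plan can pull back every parameter from a child plan's execution. A hedged sketch using only the documented commands (the plan and directory names are illustrative):

```perl
# Run a child plan, then import each of its parameters by name.
my $ref = run_plan("myPlan.pl", "vco");
foreach my $name (get_param_list($ref)) {
    my $value = get_plan_param($ref, $name);
    printlog "$name = $value";
}
```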
The following plan is used to export parameters to MATLAB for results post-processing. This
plan requires a full MATLAB license and software from The MathWorks, Inc.
# Perl plan 'param_to_MATLAB.pl' created on …
package Param_to_MATLAB;
sub run {
    acv_run_matlab_cmds("plot(vdd_value);");
}
run_plan();
The Perl editor has “smart” features that color-code the following items:
■ Comments are in green
■ Plain text is in black
■ Perl keywords are in blue
Tip
Plans that must prompt for user input can use Perl/Tk to provide GUI input windows.
Perl/Tk is an extension for creating graphical user interfaces and building graphical,
event-driven applications. See any commercially available Perl/Tk tutorial or
reference to learn how to implement and configure Perl/Tk graphical elements.
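As a sketch of this approach, the following uses only standard Perl/Tk widgets to prompt for a value before the plan continues; the prompt text and variable names are illustrative:

```perl
use Tk;   # requires the Perl/Tk extension to be installed

# Pop up a small window that asks for a corner name, then continue the plan.
my $corner = "";
my $mw = MainWindow->new(-title => "Plan Input");
$mw->Label(-text => "Corner name:")->pack();
$mw->Entry(-textvariable => \$corner)->pack();
$mw->Button(-text => "OK", -command => sub { $mw->destroy() })->pack();
MainLoop();
# $corner now holds the user input and can drive the rest of the plan
```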
Important
If netlisting is requested or required, and the netlister directory is not set
or is set incorrectly, netlisting will fail.
-batch Causes the tools to be run in batch mode
For example:
acvperl -ws ~/workspaces/new_workspace/my_workspace.awf -wsparams DefaultSet
-project project1 -plan ~/workspaces/new_workspace/plans/my_plan.pl -batch
Typically, plans are run from the project directory, and files referenced in
the plan have paths relative to that directory. However, plans can be run from
anywhere as long as the -ws and -plan named arguments are used to specify the
paths to the workspace and plan. The path to the plan file (-plan) can be
relative to the project directory of the project you specify using the
-project argument, or it can be a full path.
Also, plans can run other plans from different projects as long as the projects are all part of
the same workspace.
14
Learning by Example
3. Copy and extract the example tar file by typing the following commands at the system
prompts:
cp $ACV_ROOT/samples/acv/overview.tar.Z .
uncompress overview.tar.Z
tar xvf overview.tar
Field          Selection
Library Name   acvPllLib
Cell Name      vco8Top
View Name      schematic
Note: If acvPllLib does not appear in the list of Library Name choices, then verify that
there is an entry for acvPllLib in the cds.lib in the directory where the DFII
environment was started.
4. Click OK.
3. Click OK.
The New Workspace form appears.
4. Click in the Name field and type a name for the workspace.
Note: The Name is automatically appended to the Location so that a directory of that
Name will be created for the workspace. The Create new project check box is marked.
5. Click OK.
The first screen of the New Project Wizard appears.
6. Verify that the No radio button is selected and click Next.
7. Click Finish.
The Create Test form appears. The lib/cell/view information of the design is displayed in
the fields in the Design location group box. By default, the name of the test (which
appears in the Test name field at the top of the form) matches the name of the cell (in
this case, vco8Top). Because this example demonstrates a test setup for the Spectre
circuit simulator using VSdE Native integration, no changes are required on this form.
8. On the Create Test form, click OK.
The Test Setup window appears. The lib/cell/view information of the design is displayed in the
fields on the Design tab.
Creating a Test
The following steps are demonstrated:
■ Specifying Include Information on page 354
■ Getting Global Design Variable Information from the Schematic on page 354
■ Specifying Analyses on page 355
■ Specifying Test Measures on page 355
New project parameters (from the Value Used column), and their default values (specified in
the Value column of the Add Parameter form), are added to the Project Parameters table
on the Parameters tab of the VSdE main window. The Current Value column on the Design
Vars tab displays the evaluated value of each design variable.
Specifying Analyses
To define transient and DC operating point analyses for the test, do the following:
1. Click the Analyses tab.
2. Click to mark the Transient check box.
3. In the Stop time field, type $tran_duration.
4. Click to mark the DC OP check box.
5. Click Apply.
max_freq
min_freq
6. Click Apply.
vcogain
Upon successful completion of these tasks, the word passed appears in the Status field.
To view the measure results, and to view and back-annotate operating point information, do
the following:
1. On the Run tab of the Test Setup window, click to display the Results tab.
Measure results are displayed in the Measures table.
2. Click Op Point.
The Operating Point window appears.
3. In the Results type field, select DC OpPt from the drop-down menu.
4. In the Component type field, select bsim3v3 from the drop-down menu.
A list of bsim3v3 component types from this design appears in the table at the bottom of the
window, along with DC operating point information for each device.
To display only those devices whose vgs parameter is greater than zero, do the following:
1. In the Filter group box, mark the Parameter expression check box.
2. Type vgs>0 in the field provided and press Enter.
The list of displayed components can be filtered further by selecting only those components
in the /I16/I5 hierarchy whose vgs parameter is greater than zero, as follows:
1. In the Filter group box, mark the Component filter check box.
2. Select By pattern.
3. In the By pattern field, type /I16/I5/*.
4. Press Enter.
To create an operating point expression and add it to the Annotate list, do the following in the
Operating point expression group box:
1. In the Name field, type myExpr.
2. In the Expression field, type vds-vdsat.
3. Click Add.
The new expression name (myExpr) appears in the Annotate list.
4. Click OK.
The following changes are reflected in the table at the bottom of the Operating Point window:
■ The region column has been removed
■ The cgb column has been added
■ The myExpr column has been added
Note: Any changes made on the Operating Point Parameter and Expression Setup form are
available during the current Test Setup session. The Save button can be used to save
changes with the test so that these changes are available the next time the test is opened.
For this example, the changes are only retained for the current Test Setup session. For more
information, see “Setting Up Component Parameters and Building Expressions” on page 272.
8. Click Apply.
9. Click Run.
The Job Status tab is displayed to show the progress of the sweep. To display measured
results when the job is finished, do the following:
1. Click the Results tab.
2. Click View.
Measured results at each temperature are displayed in the Results table window.
Tip
Use the Tab key to move from one table cell to the next.
9. Click Apply.
7. Click to mark the Specify sweep and test name check box.
8. In the Sweep field, select processTempVdd from the drop-down list.
9. In the Test field, select vco8Top from the drop-down list.
10. Click OK.
To fill the table with the remaining measures for this sweep and test, define the minimum and
maximum specifications for each, and perform a comparison of the measure values with the
specifications, do the following:
1. Click Expand.
Tip
Use the buttons to resize rows and columns to fit the data they contain.
2. Specify the following values in the Min Spec and Max Spec columns for each measure
as indicated:
3. Click Compare.
The measure values are compared to the Min Spec and Max Spec values. The results of
this comparison appear in the Pass/Fail column as either Pass or Fail.
To examine the measure that failed to fall within the specified ranges, do the following:
1. At the beginning of the third row in the Specifications table, click the row heading (3) to
select this row of data (which shows a Fail in the Pass/Fail column).
2. On the toolbar, click the details button.
The details page for this row of data appears. The Pass/Fail column indicates which
corner or corners failed (Fail).
3. In the details page, click Close.
To save the main spec sheet page as an HTML document within the project:
1. On the toolbar, click the save button.
An informational window appears.
2. Click OK.
3. In the Spec window, click OK.
4. On the Documents folder in the VSdE main window, click +.
The cornerVsSpec HTML document has been added to the project.
To run the Monte Carlo analysis and plot results, do the following:
1. In the Sweep window, click Run.
The Job Status tab appears. Upon successful completion of the analysis, the word
passed appears in the Status field.
2. Click the Results tab.
3. Click View.
The measured results at each Monte Carlo iteration appear in the Results table window.
4. Click Waveform.
The Select Waveform Files form appears for plotting results.
5. Click Plot.
Note: When the Plot as family of curves check box is not marked, only the waveform for a
single iteration is plotted. The iteration is selected in the Results table window. Multiple
columns in the table can be selected using Shift-click, click-drag, or Ctrl-click.
Histogram and scatter plots are available using the MC Plot menu on the Results tab of the
Sweep window. See “Results Tab” on page 221 for more information.
Index

A
AC analysis
  plotting results 159
AC modifiers 159
ACV_DEFAULT_PROJECT_CREATE environment variable 54
ACV_DEFAULT_TEST_TEMPLATE environment variable 109
ACV_DEFAULT_WORKSPACE_PATH environment variable 55
ACV_FORCE_NON_ALTERABLE environment variable 206
ACV_POSTRUN_CMD environment variable 229
ACV_PRERUN_CMD environment variable 229
ACV_PROJECT_PATH environment variable 61
ACV_RAW_PATH environment variable 240
ACV_SIMRUN_RSH environment variable 230
ACV_SPECTRE_MPS_TIMEOUT environment variable 225
ACV_SPLASH environment variable 22
ACV_XTERM environment variable 154, 220
acvMdumpRes 147
acvMeasResult 147
acvperl command 349
Add Design Component form 132
Add Parameter form 95, 117
adding
  new tests 105
ADE Calculator Expression 144
ADE integration 163
ADE state file export 33
ADE state file import 28, 30
ADE Test Setup 163
  Create Test form 165
  Test Setup window 166
Advanced Settings for Tables form 329
Alterable 205
Analyses tab
  Spectre 196
ANALYSIS_TYPE (.config) 173
annotation
  display options 270
  instance 269
  node 271
Annotation Display Options form 270
.apf project file 52
auto-plot
  measure results 142
  waveforms 278
.awf workspace file 52

B
back-annotation
  instance 269
  node 271
BSUB_QUIET environment variable 236
builtins library 134 to 140
  ipwl_analog 135
  ipwl_digital 138
  ipwl_file 136
  istaircase 140
  vpwl_analog 135
  vpwl_digital 138
  vpwl_file 136
  vstaircase 140
BUTTON (.config) 172

C
Calculator (measures) 144
calibrating behavioral models 42, 317 to 338
CDS_Netlisting_Mode environment variable 164
Cell Formatting window 260
Check In form 74
  group of items 76
  multiple related files 76
  single item 75
Check Out File form 86
Checkpoint form 79
colors
  color-flashing problems 21

load_workspace_params 344
open_project 344
run_plan 345
update_project_params 345
update_workspace_params 345
Perl/Tk 348
plans 43, 339 to 350
  creating 341
  editing 348
  exporting to 341
  getting parameters 346
  hierarchical 343, 347
  linking to 342
  loading parameters 344
  log file 348
  messages 348
  opening a project 344
  overview 339
  running 345, 348
  running from the system prompt 349
  stopping, terminating 348
  updating parameters 345
Plot actions
  Plot analysis 158
  Plot analysis AC (modifiers) 159
  Plot types 160
Plot Results window 283
  for MATLAB plotting 284
  Options button 285
plot window menus 292
project 23, 51 to 69
  adding 58
  creating 58
  design files 62
  directory contents 52
  drag-and-drop 35
  moving 35
  new 60
  new, from previous 61
  project items 62
  tools overview 36
  working library 52
Project File (.apf) file 52
Project menu
  Workspace Value Set 100
project parameters 91, 94
  ADE tests 168
  Perl $ environment variables 91, 94
Properties form
  Design Management 81

R
raw file location 240
RAW File Location form 245
read-only mode 57
reference documents 15
related documents 15
remote jobs
  launching 229
renaming a test 34
Result list vs. result list check box 329
Result vs. spec 298
results 247, 249
  comma-separated values 157
  corners 221
  displaying leaf data 253
  expanding a folder 250
  export HTML 157
  flatten/unflatten branch 252
  Monte Carlo 221
  opening a data file 252
  opening an association 252
  operating point 263
  plotting traces 253
  print setup 157
  S-parameter data 160
  sweep 221
  table, see also Results table window 255
  test 155
  types 251
  viewing and plotting 247
  viewing Tabular Data 254
  waveforms 278
Results Statistics window 276
Results tab
  Sweep window 221
  Test Setup window, Run tab 155
  VSdE main window 25, 249
Results table window 255
  buttons 256
  File menu 258
  format cells 260
  hiding rows/columns 259
  limit values per cell 260
  right-click pop-up menu 259
  sorting rows/columns 259
Run button 45
Run Options tab 225
Run tab 150
W
waveform display 278
  auto-plot 278
Waveforms tab (Run tab, Test Setup window) 158
  AC modifiers 159
workspace 23, 51 to 69
  creating and opening 53
  creating new 54
  creating new workspace folders 35
  directory contents 52
  drag-and-drop 35
  folders 35
  moving 35
  opening existing 56
  opening recent 56
  overview 51
  parameters 95
  template, import 68
  template, new 67
  templates 64
  value set Add 100
  value set Delete 100
  value set Rename 100
  value sets 95