Generate primitives with multiple aspect ratios and then use the placer to decide which to choose. #671

Merged: 32 commits (feature/one_to_many_map_file into master), Apr 30, 2021
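In outline: a single abstract primitive (one device sizing) can now have several concrete layouts that differ only in aspect ratio, and the placer picks among them. A hypothetical picture of the resulting primitives records (names and field values invented for illustration; the real fields are set by gen_more_primitives in align/main.py below):

    # Two concrete layouts of the same hypothetical device: 3x2 and 2x3 unit cells.
    primitives = {
        'NMOS_nfin4_n2_X3_Y2': {'x_cells': 3, 'y_cells': 2,
                                'abstract_template_name': 'NMOS_nfin4',
                                'concrete_template_name': 'NMOS_nfin4_n2_X3_Y2'},
        'NMOS_nfin4_n2_X2_Y3': {'x_cells': 2, 'y_cells': 3,
                                'abstract_template_name': 'NMOS_nfin4',
                                'concrete_template_name': 'NMOS_nfin4_n2_X2_Y3'},
    }
    # The netlist references only 'NMOS_nfin4'; PnR loads both concrete
    # variants and the placer decides which aspect ratio to instantiate.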
Commits
00271d9  Set up test to work on multi-aspect primitives (stevenmburns, Apr 27, 2021)
879165b  Experiments to determine placeholder hierarchy to maintain the folded… (stevenmburns, Apr 28, 2021)
20e4fc2  Renaming templates to their concrete name (stevenmburns, Apr 28, 2021)
26982b0  Use gdsFile (cleaned up a bit) as the concrete template name (stevenmburns, Apr 28, 2021)
6efc6c2  Renamed template_name to abstract_template_name and concrete_template… (stevenmburns, Apr 28, 2021)
2776f1c  Generate some more primitives (stevenmburns, Apr 28, 2021)
dd78c8f  [skip CI] Ready for integration of an abstract to concrete map file w… (stevenmburns, Apr 29, 2021)
b2f8221  Merge branch 'master' into feature/one_to_many_map_file (stevenmburns, Apr 29, 2021)
ce9ec32  [skip CI] Added map from string to vector<string> to represent the ne… (stevenmburns, Apr 29, 2021)
9f01d19  [skip CI] More changes (stevenmburns, Apr 29, 2021)
e95a4e7  read multiple aspect ratio into gdsdata2 and block instances (854768750, Apr 29, 2021)
20ec0c0  merge (854768750, Apr 29, 2021)
1cd6ac4  check lefmaster json (854768750, Apr 29, 2021)
aba8237  [skip CI] Most stuff works; still need to fix up capacitors (stevenmburns, Apr 29, 2021)
7f614c3  Hack to fix broken capacitors (stevenmburns, Apr 29, 2021)
60ec256  Debug missing capacitor jsons (stevenmburns, Apr 30, 2021)
fb1d0f8  Merge branch 'master' into feature/one_to_many_map_file (stevenmburns, Apr 30, 2021)
540990e  [skip CI] Merge with master; broken because of Pydantic restrictions (stevenmburns, Apr 30, 2021)
16dcf73  Merge branch 'master' into feature/one_to_many_map_file (parijatm, Apr 30, 2021)
d3c7a60  [Merge fixes] (parijatm, Apr 30, 2021)
a14d809  Merge branch 'feature/one_to_many_map_file' of github.com:ALIGN-analo… (stevenmburns, Apr 30, 2021)
5354470  Cleanup (stevenmburns, Apr 30, 2021)
f90889a  Removed broken mostly disabled test (stevenmburns, Apr 30, 2021)
6b07cf1  More refactoring (stevenmburns, Apr 30, 2021)
8480ad1  Better pattern matching; better handling of Dcaps (stevenmburns, Apr 30, 2021)
b749441  Better error messages (stevenmburns, Apr 30, 2021)
a61f323  Fixed case of unused primitives (stevenmburns, Apr 30, 2021)
599a5d3  Removed a redundant data structure (stevenmburns, Apr 30, 2021)
da60b3b  Make -rp work again. (stevenmburns, Apr 30, 2021)
2106354  Remove unneeded test (stevenmburns, Apr 30, 2021)
39ab7d9  Add a rendering test (stevenmburns, Apr 30, 2021)
b2be114  Remove telescopic_ota_guard_ring from CI (stevenmburns, Apr 30, 2021)
4 changes: 2 additions & 2 deletions .circleci/config.yml
@@ -207,7 +207,7 @@ workflows:
          pdk:
            - "Finfet"
          design:
-           - "adder or switched_capacitor_filter or high_speed_comparator or telescopic_ota_guard_ring"
+           - "adder or switched_capacitor_filter or high_speed_comparator"
      - test-integration:
          name: "test-integration-cp38-<<matrix.pdk>>"
          requires:
@@ -224,7 +224,7 @@ workflows:
          pdk:
            - "Finfet"
          design:
-           - "not SAR and not DLL and not single_to_differential_converter and not COMPARATOR_2LEVEL_BIDIRECTIONAL_MAC_SKEW and not CTD"
+           - "not SAR and not DLL and not single_to_differential_converter and not COMPARATOR_2LEVEL_BIDIRECTIONAL_MAC_SKEW and not CTD and not telescopic_ota_guard_ring"
      - build-wheel:
          name: "build-wheel-<<matrix.platform>>"
          requires:
2 changes: 1 addition & 1 deletion PlaceRouteHierFlow/PnR-pybind11.cpp
@@ -457,7 +457,7 @@ PYBIND11_MODULE(PnR, m) {
        .def( "ReadConstraint_Json", &PnRdatabase::ReadConstraint_Json)
        .def_readwrite("hierTree", &PnRdatabase::hierTree)
        .def_readwrite("topidx", &PnRdatabase::topidx)
-       .def_readwrite("gdsData", &PnRdatabase::gdsData)
+       .def_readwrite("gdsData2", &PnRdatabase::gdsData2)
        .def_readwrite("DRC_info", &PnRdatabase::DRC_info)
        ;
116 changes: 68 additions & 48 deletions PlaceRouteHierFlow/PnRDB/PnRdatabase.cpp
@@ -16,7 +16,7 @@ static bool EndsWith( const string& str, const string& pat)

PnRdatabase::~PnRdatabase() {
  auto logger = spdlog::default_logger()->clone("PnRDB.PnRdatabase.~PnRdatabase");
- logger->info( "Deconstructing PnRdatabase");
+ logger->debug( "Deconstructing PnRdatabase");
}

deque<int> PnRdatabase::TraverseHierTree() {
@@ -2170,66 +2170,86 @@ hierTree[i].Terminals[hierTree[i].Nets[j].connected[k].iter].netIter = j;
}


+static string stem(const string& s) {
+
+  size_t start = 0;
+  size_t slash = s.find_last_of( '/');
+  if ( slash != string::npos) {
+    start = slash + 1;
+  }
+
+  size_t end = s.size();
+  size_t dot = s.find_last_of( '.');
+  if ( dot != string::npos) {
+    end = dot;
+  }
+
+  // xx/y.d
+  //    ^ ^
+  // 012345
+
+  return s.substr( start, end-start);
+}

bool PnRdatabase::MergeLEFMapData(PnRDB::hierNode& node){

  auto logger = spdlog::default_logger()->clone("PnRDB.PnRdatabase.MergeLEFMapData");

  bool missing_lef_file = 0;

- logger->info("merge LEF/map data on node {0}", node.name);
- for(unsigned int i=0;i<node.Blocks.size();i++){
-   const string& master=node.Blocks[i].instance.back().master;
-   if(lefData.find(master)==lefData.end()) {
-     // LEF is missing; Ok if a cap or if not a leaf
-     if(master.find("Cap")!=std::string::npos ||
-        master.find("cap")!=std::string::npos) continue;
-     if(node.Blocks[i].instance.back().isLeaf) {
-       logger->error("The key does not exist in map: {0}",master);
-       missing_lef_file = 1;
-     }
-     continue;
+ logger->info("merge LEF/map data on node {0}", node.name);
+ for (unsigned int i = 0; i < node.Blocks.size(); i++) {
+   const string abstract_template_name = node.Blocks[i].instance.front().master;
+
+   if (gdsData2.find(abstract_template_name) == gdsData2.end()) {
+     if (abstract_template_name.find("Cap") != std::string::npos || abstract_template_name.find("cap") != std::string::npos || !node.Blocks[i].instance.back().isLeaf) continue;
Review thread on this hunk:

stevenmburns (author): @854768750 I just first checked if the abstract_template_name was in gdsData2. If not, I'll skip the whole process if "cap" is in the name.

Collaborator: Currently this does not cause an error on capacitors because the cap placer is not functioning.

Collaborator: This is fine because, it seems, in BUFFER_VREFP1 Dcap is just like other ordinary primitives.

logger->error("The key does not exist in map: {0}", abstract_template_name);
}

//cout<<node.Blocks[i].instance.back().name<<" "<<master<<endl;
for(unsigned int w=0;w<lefData[master].size();++w) {
if(node.Blocks[i].instNum>0) { node.Blocks[i].instance.push_back( node.Blocks[i].instance.back() ); }
node.Blocks[i].instNum++;
node.Blocks[i].instance.back().width=lefData[master].at(w).width;
node.Blocks[i].instance.back().height=lefData[master].at(w).height;
node.Blocks[i].instance.back().lefmaster=lefData[master].at(w).name;
node.Blocks[i].instance.back().originBox.LL.x=0;
node.Blocks[i].instance.back().originBox.LL.y=0;
node.Blocks[i].instance.back().originBox.UR.x=lefData[master].at(w).width;
node.Blocks[i].instance.back().originBox.UR.y=lefData[master].at(w).height;
node.Blocks[i].instance.back().originCenter.x=lefData[master].at(w).width/2;
node.Blocks[i].instance.back().originCenter.y=lefData[master].at(w).height/2;

for(unsigned int j=0;j<lefData[master].at(w).macroPins.size();j++){

unsigned int variants_count = gdsData2[abstract_template_name].size();
node.Blocks[i].instance.resize(variants_count);
for (unsigned int j = 1; j < variants_count; j++) node.Blocks[i].instance[j] = node.Blocks[i].instance[0];
node.Blocks[i].instNum = variants_count;
for (unsigned int j = 0; j < variants_count; j++) {
auto& b = node.Blocks[i].instance[j];
b.gdsFile = gdsData2[abstract_template_name][j];
string a_concrete_template_name = stem(b.gdsFile);
if (lefData.find(a_concrete_template_name) == lefData.end()) {
logger->error("No LEF file for a_concrete_template_name {0}", a_concrete_template_name);
missing_lef_file = 1;
continue;
}
auto& lef = lefData.at(a_concrete_template_name).front();
b.interMetals = lef.interMetals;
b.interVias = lef.interVias;
// node.Blocks[i].instNum++;
b.width = lef.width;
b.height = lef.height;
b.lefmaster = lef.name;
b.originBox.LL.x = 0;
b.originBox.LL.y = 0;
b.originBox.UR.x = lef.width;
b.originBox.UR.y = lef.height;
b.originCenter.x = lef.width / 2;
b.originCenter.y = lef.height / 2;

for (unsigned int k = 0; k < b.blockPins.size(); k++) {
bool found = 0;
for(unsigned int k=0;k<node.Blocks[i].instance.back().blockPins.size();k++){
if(lefData[master].at(w).macroPins[j].name.compare(node.Blocks[i].instance.back().blockPins[k].name)==0){
node.Blocks[i].instance.back().blockPins[k].type = lefData[master].at(w).macroPins[j].type;
node.Blocks[i].instance.back().blockPins[k].pinContacts = lefData[master].at(w).macroPins[j].pinContacts;
node.Blocks[i].instance.back().blockPins[k].use = lefData[master].at(w).macroPins[j].use;
for (unsigned int m = 0; m < lef.macroPins.size(); m++) {
if (lef.macroPins[m].name.compare(b.blockPins[k].name) == 0) {
b.blockPins[k].type = lef.macroPins[m].type;
b.blockPins[k].pinContacts = lef.macroPins[m].pinContacts;
b.blockPins[k].pinVias = lef.macroPins[m].pinVias;
b.blockPins[k].use = lef.macroPins[m].use;
found = 1;
}
}
if(found == 0){
node.Blocks[i].instance.back().blockPins.push_back(lefData[master].at(w).macroPins[j]);
break;
}
}
if (found == 0) logger->error("Block {0} pin {1} not found in lef file", b.name, b.blockPins[k].name);
}

node.Blocks[i].instance.back().interMetals = lefData[master].at(w).interMetals;
node.Blocks[i].instance.back().interVias = lefData[master].at(w).interVias;
node.Blocks[i].instance.back().gdsFile=gdsData[lefData[master].at(w).name];
//cout<<"xxx "<<node.Blocks[i].instance.back().gdsFile<<endl;
}


assert(!missing_lef_file);
}

assert( !missing_lef_file);

return 1;

}
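The new loop above is the heart of the change: each block now carries one instance per concrete gds variant of its abstract template, each sized from that variant's LEF. A rough Python rendering of that flow (the Instance record and all data are simplified, hypothetical stand-ins for the PnRDB structures):

    import os
    from dataclasses import dataclass, replace
    from typing import Dict, List

    @dataclass
    class Instance:                      # simplified stand-in for a PnRDB block instance
        master: str
        gds_file: str = ''
        width: int = 0
        height: int = 0

    def expand_variants(proto: Instance,
                        gds_data2: Dict[str, List[str]],
                        lef_data: Dict[str, dict]) -> List[Instance]:
        """Clone the prototype once per concrete gds file, sizing each from its LEF."""
        variants = []
        for gds in gds_data2[proto.master]:
            concrete = os.path.splitext(os.path.basename(gds))[0]   # what the C++ stem() computes
            lef = lef_data[concrete]
            variants.append(replace(proto, gds_file=gds,
                                    width=lef['width'], height=lef['height']))
        return variants

    # Hypothetical data: one abstract name with two aspect ratios.
    gds_data2 = {'NMOS_nfin4': ['gds/NMOS_nfin4_n2_X3_Y2.gds', 'gds/NMOS_nfin4_n2_X2_Y3.gds']}
    lef_data = {'NMOS_nfin4_n2_X3_Y2': {'width': 30, 'height': 20},
                'NMOS_nfin4_n2_X2_Y3': {'width': 20, 'height': 30}}
    variants = expand_variants(Instance('NMOS_nfin4'), gds_data2, lef_data)
    assert [(v.width, v.height) for v in variants] == [(30, 20), (20, 30)]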
2 changes: 1 addition & 1 deletion PlaceRouteHierFlow/PnRDB/PnRdatabase.h
@@ -82,7 +82,7 @@ class PnRdatabase
    int unitScale;
    map<string, vector<PnRDB::lefMacro> > lefData; //map from Macro name to Macro Instance
  public:
-   map<string, string> gdsData; //map from gds name to gds file
+   map<string, vector<string> > gdsData2; //map from gds name to multiple gds file (abstract to multiple concrete)
Review comment from stevenmburns (author): @854768750 I added this as the C++ data structure of the new mapping.

  private:
    PnRDB::designRule drData;

4 changes: 2 additions & 2 deletions PlaceRouteHierFlow/placer/ILP_solver.cpp
@@ -601,8 +601,8 @@ void ILP_solver::PlotPlacement(design& mydesign, SeqPair& curr_sp, string outfil

  int bias = 50;
  int range = std::max(UR.x, UR.y) + bias;
- fout << "\nset xrange [" << -range << ":" << range << "]" << endl;
- fout << "\nset yrange [" << 0 - bias << ":" << range << "]" << endl;
+ fout << "\nset xrange [" << LL.x - bias << ":" << UR.x + bias << "]" << endl;
+ fout << "\nset yrange [" << LL.y - bias << ":" << UR.y + bias << "]" << endl;
  // set labels for blocks
  for (int i = 0; i < mydesign.Blocks.size(); ++i) {
    placerDB::point tp;
61 changes: 60 additions & 1 deletion align/main.py
@@ -2,6 +2,9 @@
import shutil
import os
import json
+import re
+import copy
+from collections import defaultdict

from .compiler import generate_hierarchy
from .primitive import generate_primitive
@@ -33,6 +36,59 @@ def build_steps( flow_start, flow_stop):
    return steps_to_run

+def gen_more_primitives( primitives, topology_dir, subckt):

Review comment from stevenmburns (author): @kkunal1408 @arvuce22 This is the code that needs to be moved upstream in the flow, either topology, or primitives, or both.
"""primitives dictiionary updated in place"""

map_d = defaultdict(list)

# As a hack, add more primitives if it matches this pattern
p = re.compile( r'^(\S+)_nfin(\d+)_n(\d+)_X(\d+)_Y(\d+)(|_\S+)$')

more_primitives = {}

for k,v in primitives.items():
m = p.match(k)
if m:
logger.info( f'Matched primitive {k}')
nfin,n,X,Y = tuple(int(x) for x in m.groups()[1:5])
abstract_name = f'{m.groups()[0]}_nfin{nfin}{m.groups()[5]}'
map_d[abstract_name].append( k)
if X != Y:
concrete_name = f'{m.groups()[0]}_nfin{nfin}_n{n}_X{Y}_Y{X}{m.groups()[5]}'
map_d[abstract_name].append( concrete_name)
if concrete_name not in primitives and \
concrete_name not in more_primitives:
more_primitives[concrete_name] = copy.deepcopy(v)
more_primitives[concrete_name]['x_cells'] = Y
more_primitives[concrete_name]['y_cells'] = X
else:
logger.warning( f'Didn\'t match primitive {k}')
map_d[k].append( k)

primitives.update( more_primitives)

concrete2abstract = { vv:k for k,v in map_d.items() for vv in v}

for k,v in primitives.items():
v['abstract_template_name'] = concrete2abstract[k]
v['concrete_template_name'] = k

# now hack the netlist to replace the template names using the concrete2abstract mapping

with (topology_dir / f'{subckt}.verilog.json').open( 'rt') as fp:
verilog_json_d = json.load(fp)

for module in verilog_json_d['modules']:
for instance in module['instances']:
t = instance['template_name']
if t in concrete2abstract:
del instance['template_name']
instance['abstract_template_name'] = concrete2abstract[t]

with (topology_dir / f'{subckt}.verilog.json').open( 'wt') as fp:
json.dump( verilog_json_d, fp=fp, indent=2)


def schematic2layout(netlist_dir, pdk_dir, netlist_file=None, subckt=None, working_dir=None, flatten=False, unit_size_mos=10, unit_size_cap=10, nvariants=1, effort=0, check=False, extract=False, log_level=None, verbosity=None, generate=False, python_gds_json=True, regression=False, uniform_height=False, render_placements=False, PDN_mode=False, flow_start=None, flow_stop=None):

    steps_to_run = build_steps( flow_start, flow_stop)
@@ -92,6 +148,9 @@ def schematic2layout(netlist_dir, pdk_dir, netlist_file=None, subckt=None, worki
    if '1_topology' in steps_to_run:
        topology_dir.mkdir(exist_ok=True)
        primitives = generate_hierarchy(netlist, subckt, topology_dir, flatten, pdk_dir, uniform_height)
+
+       gen_more_primitives( primitives, topology_dir, subckt)
+
        with (topology_dir / 'primitives.json').open( 'wt') as fp:
            json.dump( primitives, fp=fp, indent=2)
    else:
@@ -111,7 +170,7 @@
    pnr_dir = working_dir / '3_pnr'
    if '3_pnr' in steps_to_run:
        pnr_dir.mkdir(exist_ok=True)
-       variants = generate_pnr(topology_dir, primitive_dir, pdk_dir, pnr_dir, subckt, nvariants=nvariants, effort=effort, check=check, extract=extract, gds_json=python_gds_json, render_placements=render_placements, PDN_mode=PDN_mode)
+       variants = generate_pnr(topology_dir, primitive_dir, pdk_dir, pnr_dir, subckt, primitives=primitives, nvariants=nvariants, effort=effort, check=check, extract=extract, gds_json=python_gds_json, render_placements=render_placements, PDN_mode=PDN_mode)
        results.append( (netlist, variants))
        assert len(variants) >= 1, f"No layouts were generated for {netlist}. Cannot proceed further. See LOG/align.log for last error."
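To make the pattern hack in gen_more_primitives concrete, here is what the regex yields for one matching name, and how a netlist instance record is rewritten afterwards (a sketch; the primitive name and instance are invented):

    import copy
    import re

    p = re.compile(r'^(\S+)_nfin(\d+)_n(\d+)_X(\d+)_Y(\d+)(|_\S+)$')

    # Hypothetical concrete primitive produced by the topology step.
    primitives = {'NMOS_nfin4_n2_X3_Y2': {'x_cells': 3, 'y_cells': 2}}

    k = 'NMOS_nfin4_n2_X3_Y2'
    m = p.match(k)
    nfin, n, X, Y = (int(x) for x in m.groups()[1:5])
    abstract_name = f'{m.groups()[0]}_nfin{nfin}{m.groups()[5]}'                # 'NMOS_nfin4'
    swapped_name = f'{m.groups()[0]}_nfin{nfin}_n{n}_X{Y}_Y{X}{m.groups()[5]}'  # 'NMOS_nfin4_n2_X2_Y3'

    # The transposed variant reuses the original parameters with x/y swapped.
    primitives[swapped_name] = copy.deepcopy(primitives[k])
    primitives[swapped_name]['x_cells'], primitives[swapped_name]['y_cells'] = Y, X

    # The netlist hack: instances drop template_name in favor of the abstract name.
    instance = {'instance_name': 'M1', 'template_name': k}
    del instance['template_name']
    instance['abstract_template_name'] = abstract_name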
68 changes: 13 additions & 55 deletions align/pnr/build_pnr_model.py
@@ -4,6 +4,7 @@
import json
import re
from itertools import chain
+from collections import defaultdict

from .. import PnR
from ..schema.hacks import VerilogJsonTop
@@ -12,56 +13,6 @@

NType = PnR.NType

-def analyze_hN( tag, hN, beforeAddingBlockPins=False):
-    logger.info( f'{tag} name {hN.name}')
-
-    logger.info( f'Nets and PowerNets')
-    for net in chain( hN.Nets, hN.PowerNets):
-        logger.info( f' {net.name}')
-        for conn in net.connected:
-            if conn.type == NType.Block:
-                if 0 <= conn.iter2 < len(hN.Blocks):
-                    blk = hN.Blocks[conn.iter2]
-                    inst = blk.instance[0]
-
-                    if 0 <= conn.iter < len(inst.blockPins):
-                        logger.info( f' {conn.type} {conn.iter} ({inst.blockPins[conn.iter].name}) {conn.iter2} ({inst.name} {inst.master})')
-                    else:
-                        logger.info( f' {conn.type} {conn.iter} (<out of range>) {conn.iter2} ({inst.name} {inst.master})')
-
-                else:
-                    logger.info( f' {conn.type} {conn.iter} (<unknown>) {conn.iter2} (<out of range>)')
-            elif conn.type == NType.Terminal:
-                assert conn.iter2 == -1
-                if 0 <= conn.iter < len(hN.Terminals):
-                    logger.info( f' {conn.type} {conn.iter} ({hN.Terminals[conn.iter].name})')
-                else:
-                    logger.info( f' {conn.type} {conn.iter} (<out of range>)')
-
-    logger.info( f'PowerNets (second pass)')
-    for net in hN.PowerNets:
-        logger.info( f' {net.name}')
-        for conn in net.dummy_connected:
-            if 0 <= conn.iter2 < len(hN.Blocks):
-                blk = hN.Blocks[conn.iter2]
-                logger.info( f' blk.selectedInstance={blk.selectedInstance}')
-                for inst_idx,inst in enumerate(blk.instance):
-                    if beforeAddingBlockPins:
-                        if 0 <= conn.iter < len(inst.dummy_power_pin):
-                            logger.info( f' {conn.iter} ({inst.dummy_power_pin[conn.iter].name}) {conn.iter2} ({inst.name} {inst.master}) inst_idx={inst_idx}')
-                        else:
-                            logger.info( f' {conn.iter} (<out of range>) {conn.iter2} ({inst.name} {inst.master}) inst_idx={inst_idx}')
-            else:
-                logger.info( f' {conn.iter} (<unknown>) {conn.iter2} (<out of range>)')
-
-    logger.info( f'Blocks')
-    for blk in hN.Blocks:
-        logger.info( f' blk.child={blk.child} len(blk.instance)={len(blk.instance)} blk.selectedInstance={blk.selectedInstance} blk.instNum={blk.instNum}')
-        for inst in blk.instance:
-            logger.info( f' inst.name={inst.name} inst.master={inst.master} len(inst.dummy_power_pin)={len(inst.dummy_power_pin)}')
-

def ReadVerilogJson( DB, j):
    hierTree = []

@@ -91,7 +42,13 @@ def ReadVerilogJson( DB, j):
            temp_blockComplex = PnR.blockComplex()
            current_instance = PnR.block()

-           current_instance.master = instance['template_name']
+           if 'template_name' in instance:
+               current_instance.master = instance['template_name']
+           elif 'abstract_template_name' in instance:
+               current_instance.master = instance['abstract_template_name']
+           else:
+               assert False, f'Missing template_name (abstract or otherwise) in instance {instance}'

            current_instance.name = instance['instance_name']

            blockPins = []
@@ -148,15 +105,16 @@ def process_connection( iter, net_name):
def _ReadMap( path, mapname):
    d = pathlib.Path(path)
    p = re.compile( r'^(\S+)\s+(\S+)\s*$')
-   tbl = {}
+   tbl2 = defaultdict(list)
    with (d / mapname).open( "rt") as fp:
        for line in fp:
            line = line.rstrip('\n')
            m = p.match(line)
            assert m
            k, v = m.groups()
-           tbl[k] = str(d / v)
-   return tbl
+           tbl2[k].append( str(d / v))
Review comment from stevenmburns (author): @854768750 Here tbl2 is the python version of the map from abstract name to list of concrete names.

+   logger.debug( f'expanded table: {tbl2}')
+   return tbl2

def _attach_constraint_files( DB, fpath):
    d = pathlib.Path(fpath)
@@ -190,7 +148,7 @@ def PnRdatabase( path, topcell, vname, lefname, mapname, drname):
    DB.ReadPDKJSON( path + '/' + drname)

    _ReadLEF( DB, path, lefname)
-   DB.gdsData = _ReadMap( path, mapname)
+   DB.gdsData2 = _ReadMap( path, mapname)

    j = None
    if vname.endswith(".verilog.json"):
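To illustrate the one-to-many map file that _ReadMap now expands: the same abstract key may appear on several lines, and every match is collected. A self-contained sketch (the file names are hypothetical):

    from collections import defaultdict
    import pathlib, re

    # Hypothetical contents of the .map file: two concrete gds files for one
    # abstract template name, one for another.
    map_text = """NMOS_nfin4 gds/NMOS_nfin4_n2_X3_Y2.gds
    NMOS_nfin4 gds/NMOS_nfin4_n2_X2_Y3.gds
    PMOS_nfin4 gds/PMOS_nfin4_n2_X2_Y2.gds"""

    d = pathlib.Path('run_dir')                 # stands in for the PnR input directory
    p = re.compile(r'^\s*(\S+)\s+(\S+)\s*$')
    tbl2 = defaultdict(list)
    for line in map_text.splitlines():
        k, v = p.match(line).groups()
        tbl2[k].append(str(d / v))

    assert len(tbl2['NMOS_nfin4']) == 2         # DB.gdsData2 receives both variants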