Re: [Gems-users] RING Topology


Date: Mon, 06 Apr 2009 14:12:27 -0500
From: Muhammad Shoaib <shoaibbinalt@xxxxxxxx>
Subject: Re: [Gems-users] RING Topology
Thanks, Greg, for the reply.

But can you point me to where I should look if I want to make the links unidirectional? Does the routing algorithm need to be changed, or is it only a matter of parameter changes?
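To make the question concrete, here is a minimal sketch in plain Python (not GEMS code; the next-hop rule is just my illustration of shortest-path routing on a 16-node bidirectional ring). From any source, roughly half the destinations are reached clockwise and half counter-clockwise, so both directions of every link get used unless the routing itself is changed:

N = 16  # number of int_nodes in the ring

def next_hop(src, dst):
    # Shortest-path choice: clockwise distance is (dst - src) mod N,
    # counter-clockwise distance is (src - dst) mod N.
    cw = (dst - src) % N
    ccw = (src - dst) % N
    if cw <= ccw:
        return (src + 1) % N   # go clockwise
    return (src - 1) % N       # go counter-clockwise

# From node 0: destinations 1..8 leave clockwise, 9..15 counter-clockwise.
print([next_hop(0, d) for d in range(1, N)])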

Muhammad Shoaib


----- Original Message -----
From: Greg Byrd <gbyrd@xxxxxxxx>
Date: Saturday, April 4, 2009 8:53 pm
Subject: Re: [Gems-users] RING Topology
To: Gems Users <gems-users@xxxxxxxxxxx>


> This is certainly a ring, but it won't be a unidirectional ring unless 
> you change how routing is done.  The links are bidirectional, and the 
> routing algorithm generates a shortest-path table.  So a packet will 
> go in the direction that is the shortest path to its destination.
> 
> I don't know if you want a unidirectional ring.  If not, I think 
> you're done!
> 
> ...Greg
> 
> 
> Muhammad Shoaib wrote:
> > Hi All, 
> > 
> > Will the following be a valid RING topology, specified in the file
> > "RING_Procs-16_ProcsPerChip-16_L2Banks-16_Memories-16.txt"?
> > 
> > 
> > 
> > processors:16
> > procs_per_chip:16
> > L2banks:16
> > memories:16
> > 
> > ext_node:L1Cache:0 int_node:0 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:1 int_node:1 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:2 int_node:2 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:3 int_node:3 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:4 int_node:4 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:5 int_node:5 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:6 int_node:6 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:7 int_node:7 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:8 int_node:8 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:9 int_node:9 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:10 int_node:10 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:11 int_node:11 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:12 int_node:12 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:13 int_node:13 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:14 int_node:14 link_latency:1 bw_multiplier:72 
> > ext_node:L1Cache:15 int_node:15 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:0 int_node:0 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:1 int_node:1 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:2 int_node:2 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:3 int_node:3 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:4 int_node:4 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:5 int_node:5 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:6 int_node:6 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:7 int_node:7 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:8 int_node:8 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:9 int_node:9 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:10 int_node:10 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:11 int_node:11 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:12 int_node:12 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:13 int_node:13 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:14 int_node:14 link_latency:1 bw_multiplier:72 
> > ext_node:L2Cache:15 int_node:15 link_latency:1 bw_multiplier:72 
> > ext_node:Directory:0 int_node:0 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:1 int_node:1 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:2 int_node:2 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:3 int_node:3 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:4 int_node:4 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:5 int_node:5 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:6 int_node:6 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:7 int_node:7 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:8 int_node:8 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:9 int_node:9 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:10 int_node:10 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:11 int_node:11 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:12 int_node:12 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:13 int_node:13 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:14 int_node:14 link_latency:20 bw_multiplier:80 
> > ext_node:Directory:15 int_node:15 link_latency:20 bw_multiplier:80 
> > 
> > 
> > int_node:0 int_node:1 link_latency:1 bw_multiplier:72 
> > int_node:1 int_node:2 link_latency:1 bw_multiplier:72 
> > int_node:2 int_node:3 link_latency:1 bw_multiplier:72 
> > int_node:3 int_node:4 link_latency:1 bw_multiplier:72 
> > int_node:4 int_node:5 link_latency:1 bw_multiplier:72 
> > int_node:5 int_node:6 link_latency:1 bw_multiplier:72 
> > int_node:6 int_node:7 link_latency:1 bw_multiplier:72 
> > int_node:7 int_node:8 link_latency:1 bw_multiplier:72 
> > int_node:8 int_node:9 link_latency:1 bw_multiplier:72 
> > int_node:9 int_node:10 link_latency:1 bw_multiplier:72 
> > int_node:10 int_node:11 link_latency:1 bw_multiplier:72
> > int_node:11 int_node:12 link_latency:1 bw_multiplier:72 
> > int_node:12 int_node:13 link_latency:1 bw_multiplier:72 
> > int_node:13 int_node:14 link_latency:1 bw_multiplier:72 
> > int_node:14 int_node:15 link_latency:1 bw_multiplier:72 
> > int_node:15 int_node:0 link_latency:1 bw_multiplier:72  
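A side note on the file layout: the internal ring links above follow a regular pattern, so this section can be emitted by a short script rather than written by hand. A sketch in plain Python, assuming the same latency and bw_multiplier values as above:

N = 16
for i in range(N):
    # one ring link per node; the last line closes the ring (15 -> 0)
    print("int_node:%d int_node:%d link_latency:1 bw_multiplier:72"
          % (i, (i + 1) % N))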
> > 
> > 
> > Regards
> > 
> > 
> 
> 