Bryan and Justin always crank out great-looking stuff, so the systems guys try to keep up.
Red and orange run out to storage switches.
Black is for the dark, private interconnects between servers.
Blue is out to the public like water flowing.
White is for remote ALOM and goes to the console servers, like the archetypal knight in shining white armor.
Green, which you don’t see here, is used for interconnecting and meshing the switches themselves. What does green mean to you?
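If you wanted a crib sheet, the whole scheme boils down to a simple lookup. A minimal sketch in Python (the mapping just restates the list above; the role_of helper is illustrative, not any real tool):

    # Hypothetical crib sheet for the color scheme above; the mapping mirrors
    # the list in the post, and the helper name role_of is just illustrative.
    CABLE_COLORS = {
        "red": "storage switches",
        "orange": "storage switches",
        "black": "private server-to-server interconnects",
        "blue": "public-facing network",
        "white": "remote ALOM / console servers",
        "green": "switch-to-switch interconnects and meshing",
    }

    def role_of(color):
        """Look up what a cable color means, defaulting to 'unknown'."""
        return CABLE_COLORS.get(color.lower(), "unknown")

    for color in ("red", "green", "purple"):
        print(color, "->", role_of(color))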
3 responses to “What sysadmins start doing when hanging around designers”
I like the color-coding. The ethernet in our data center is all uninformative blues and yellows. And green…green could be money, or it could be envy. Not too sure about the applicability of the latter, but the former must be involved somewhere. This is about sweet, sweet hardware, after all.
Just curious – for your private interconnects (I assume these are RFC 1918 addresses), do you refer to other servers by their public DNS names, private DNS names, or their IP addresses? (I suppose another way would be split-horizon DNS, with names resolving to different IPs depending on who’s asking.)
I’m curious because we tend to give servers private names that map to the private IPs, but over time that has led to confusion and occasional out-of-sync-ness.
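To make the split-horizon option concrete: the nameserver answers differently depending on where the query comes from. A toy sketch of that logic in Python (real setups do this in the nameserver itself, e.g. BIND views; the subnet, hostname, and addresses below are made up):

    import ipaddress

    # Toy sketch of the split-horizon idea: hand back the internal address to
    # internal clients and the public one to everyone else. Real deployments
    # do this in the nameserver (e.g. BIND views); the subnet, hostname, and
    # addresses here are invented for illustration.
    INTERNAL_NET = ipaddress.ip_network("10.0.0.0/8")

    RECORDS = {
        "db1.example.com": {"internal": "10.1.2.10", "external": "203.0.113.10"},
    }

    def resolve(name, client_ip):
        """Pick the view by where the query came from, then answer from it."""
        view = "internal" if ipaddress.ip_address(client_ip) in INTERNAL_NET else "external"
        return RECORDS[name][view]

    print(resolve("db1.example.com", "10.1.5.99"))     # internal asker -> 10.1.2.10
    print(resolve("db1.example.com", "198.51.100.7"))  # outside asker  -> 203.0.113.10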
It’s all in DNS with an LDAP backend, and they’re referred to by their private DNS names.
Also, the IP space is functionally split, so you can tell what a box does and where it sits just from looking at the numbers.
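The reply doesn’t spell out how the space is split, but the idea is that the address itself is readable. A hypothetical sketch, with invented octet assignments:

    import ipaddress

    # Hypothetical example of a functionally split 10/8: say the second octet
    # encodes function and the third encodes location. The real split isn't
    # described in the post, so these assignments are invented.
    FUNCTIONS = {1: "web", 2: "database", 3: "storage"}
    LOCATIONS = {1: "rack A", 2: "rack B"}

    def describe(ip):
        """Read function and location straight out of the address octets."""
        octets = ipaddress.ip_address(ip).packed
        function = FUNCTIONS.get(octets[1], "unknown function")
        location = LOCATIONS.get(octets[2], "unknown location")
        return "%s: %s in %s" % (ip, function, location)

    print(describe("10.2.1.15"))  # -> 10.2.1.15: database in rack A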