Hello, I'm in the process of planning a new data room and IT office(s), and I'm trying to get as much information from the community as I can. I found a great post on this same subject and it's been really helpful, but I still have a few questions of my own.
I have two small data rooms right now. One room has one server rack with two equipment ladders (I believe that is the correct term), one on each side of the rack. The other data room is much smaller (more like a closet) and has only a single server rack, which holds all the necessary equipment for the whole office.
By the end of next year we'll be in a new building that will replace the two we currently have. The new data room is being built to support three server racks and two equipment ladders. This effectively doubles my current capacity. I know that we will be growing in the future but I'm not really sure that we'll be growing so much that I'll need space for a fourth rack.
My questions are: 1. Cooling: Currently we have a single unit that keeps the room plenty cool. However, this unit probably won't be able to support the extra area of the new room, plus the extra equipment, so I'll need something better. One idea I had to avoid buying a big unit is to install an exhaust fan controlled by a thermostat. The exhaust fan would run any time the A/C was not running, to (hopefully) remove the heat that builds up slowly and replace it with cooler air from the hallway. I also had the idea of using a heat exchanger (if that's what they're called) to take advantage of cold outside air when appropriate.
I thought that this unit could run any time the outside air is at least 5 °F cooler than the inside air. 2. Raised floors: I don't plan to raise the floor of the entire place, but I thought it might be a good idea to put a small platform behind the server racks (2"-4" in height) so that I could run power cables underneath it. I imagine the platform would be simple and probably built out of wood. Any thoughts on this?
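The outside-air cutover idea (run the exchanger whenever outside air is at least 5 °F cooler than inside) is simple enough to sketch as control logic. This is just an illustration of the decision rule described in the post; the hysteresis value is a made-up assumption to avoid short-cycling, not an HVAC recommendation:

```python
# Sketch of the economizer/heat-exchanger control logic described above.
# DELTA_ON_F comes from the post (5 degree advantage); HYSTERESIS_F is an
# illustrative assumption so the unit doesn't cycle on/off around 5 degrees.

DELTA_ON_F = 5.0    # start free cooling when outside is this much cooler
HYSTERESIS_F = 2.0  # keep running until the advantage shrinks below this

def economizer_should_run(inside_f: float, outside_f: float,
                          currently_running: bool) -> bool:
    """Decide whether the outside-air unit should run right now."""
    advantage = inside_f - outside_f  # positive when outside is cooler
    if currently_running:
        # Already running: only stop once the advantage drops below the band.
        return advantage >= HYSTERESIS_F
    return advantage >= DELTA_ON_F

# Example: 75 F inside, 68 F outside -> 7 F advantage, so start free cooling.
print(economizer_should_run(75.0, 68.0, currently_running=False))  # True
```

The hysteresis band is the important design choice: without it, a thermostat sitting right at the 5 °F threshold would flip the fan on and off constantly.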
3. Electrical outlets: The data room in our main building has electrical outlet boxes attached to the wall. They protrude about 2" from the wall. I have my UPS power cords plugged into these, and because of the shape of the electrical cord I've got another 2" of protrusion from the wall. That gives me about 4" of protrusion total, leaving lots of chances for a cable to be pulled out accidentally or weakened over time from being bumped.
Any ideas on how to solve this? Is it possible to have recessed electrical outlets? 4. Cabling: We'll be rewiring the new building entirely because we'll be rebuilding the interior from scratch. I think that CAT 5e will be good enough for our needs, but I'm curious if there's any reason I should consider opting for the more expensive CAT 6. 5. Grounding: I read about grounding in the thread I linked to earlier. It's something I never considered, and I honestly don't know much about electricity.
I would guess that any electrician we work with would know how to properly ground the equipment racks in the data room, but since I don't know about it myself, what points should I be aware of so that I can spot an electrician who may not know what they're doing? 6. Lighting: Other than the standard ceiling-mounted fluorescent lights, I was planning on putting a light behind the equipment racks to give myself some light back there (which even in my small data room is a problem). Any tips on this area? If anyone has any pictures to share I'd greatly appreciate them! Thanks for your time! My company did this in the past few years, and it was a blast planning out the new rooms and the 'interface' with the rest of the offices.
We got a portable cooler from APC and that takes care of our server room, but I believe you are looking at a larger space than we have. Try their site for the configuration tools there; they should give you a good idea of what you need and can afford. Electrical outlets: we had specialty outlets installed (make sure they install these during the build-out) very close to where the UPSes sit, so while the plugs stick out, the sockets themselves are placed out of harm's way, or at least out of harm's usual path of destruction. Besides the server room, make sure your offices themselves have way more outlets than you ever expect to use. I really can't count how many times I've gone to plug something in in my office and had to hunt for an outlet. For cabling we bit the bullet and put in CAT6; it's way cheaper to put it in the first time than to redo it later down the road. Not to mention we got to cross that one off the secret IT wishlist (well, most of it's secret; we're just picking our battles). How long the company plans to stay in this building is really the deciding factor.
Also, you will want to consider CAT6 if you are looking at VoIP and heavy video conferencing within your company in the future. One thing we did not do in all of our giddy excitement was plan for power outages. Usually the power doesn't go out for more than an hour, so it's no big deal; we have a fine set of UPSes to get us through that. I'm not sure where you are located, but hurricane Ike very kindly reminded us of our error and we spent a little more than a week with no electricity, so make sure you get that part covered no matter how you intend to handle it. Chris, your UPSes may require higher-voltage plugs like you would find on a washer/dryer or other large appliance that can't use a regular outlet. One of ours has a huge round monstrosity that wouldn't even fit in the 'normal' high-voltage outlets. It was a pain to have someone come out and install that in what served as our server room in the old building; electricity had to be cut off to the room during business hours, and you can imagine how well that went over with the general populace. So we made sure it was put in during the build-out for the new office.
We weren't too worried about power outages here either. But things happen, and I promise it's much easier to make sure you have the generator, or at least a plan for one, in place beforehand. As it is, we just plan to rent two and bring them up here if we are in that position again (one for the servers and spot cooler, the other for the workstations). Oh, and I edited the section on CAT6 in my previous post; it may apply to you since your company is looking at expanding. Hi Chris, I saw your direct message and thought I'd reply here. First, regarding my Visio diagram: I believe the Visio shapes I used, which allow for server rack planning, are included in the full (Pro?) version of Visio. (By the way, at least in the US, ladder is the term for stiff cable trays.
Racks are either two-post or four-post, and called server racks or telco racks. I highly recommend APC's website for planning rack utilization and selecting the correct racks for your equipment.) The rest of the shapes are 'wall' shapes from the office plan part of Visio.
Even if you don't have these shapes, appropriately-shaped rectangles and lines work just fine. Major vendors like 3com, APC, and Cisco have Visio shape packs available for download as well, though they're not really necessary. I'm just a very visual person, so the act of seeing my server room before I build it is very valuable to me. You might not be the same. My plan is nearly identical to its current configuration a year later, except: - I added 2U for IP-KVM space at the top of the rack, removed the bottom cable organizer and switch, and added a lot of servers in the bottom half. My biggest mistake in planning is that I planned top-down.
Racks should always be planned bottom-up, for stability and accessibility. Leave extra space in between groups of objects (like on top and bottom of my block of switches) for expansion and changes. It sucks to remove 40 screws to add one item at the top. I highly recommend quick-release thumb screws from rackrelease.com (not regular thumb screws; these have tiny posts which stay in the rack and easy-turn knobs to tighten or loosen, much easier than regular screws). - I didn't put the PBX on the plan (it's on the backboard), so it's a little tight behind the rack. There is no second backboard on the side; rather, the one backboard extends from 0 to 8 feet from the ground. This simplifies routing.
Although my plan dictated A/C ducts, vents, and thermostats specifically, building construction didn't follow the plan and instead installed an in-wall A/C unit in the middle of the wall, creating airflow problems and wasting wall space. They also put the office A/C on the same circuit as the women's bathroom, so either we're freezing or they're sweating. My plan called for a single light in the middle of the rooms, but they installed two lights 1/4 and 3/4 of the way across the ceilings. This works much better than my plan as you can see below in my answer to your lighting question.
Light switch for the server room is located outside of the server room. We've got to either walk or ask someone else to turn it on sometimes, not quite convenient.
I didn't plan for shelves or storage space inside the server room, but we later installed shelving above the A/C unit on the back wall to increase storage capacity. (By the way, all our shelves are crammed and we routinely give away excess equipment. Storage space is paramount: get all you think you need and then plan 50% more.) Our main door actually opens outward, and we had space on the south wall for more floor-to-ceiling shelves, which are packed. We've got filing cabinets under our desks, and used to have two cabinets in the server room until we wikified all our data.
We never installed the fourth rack (top-right) since all our equipment fits on the two 2-posts and one 4-post. I've got a spare 2-post sitting in the corner to bolt to the floor if we ever need it (but I doubt we will). - I didn't realize my new 2200VA UPS draws a higher amperage than normal, and thus uses a different 3-prong plug, so we had to install an additional 30-amp circuit in the server room. Same deal with the APC network-managed PDU: it wasn't designed to be plugged into that UPS, so we ended up getting an adapter from Ace Hardware. I highly recommend a CDW or APC advisor for help with this because their sites don't give much info on this type of stuff. - Drilling the holes in the floor for the racks was quite hard. The floors are reinforced concrete with rebar, and apparently drills don't like that.
Drilling straight and with the correct spacing was a chore, and be careful with the bolts: once you tighten them, there's no going back. I recommend hiring a professional. - The sliding glass door is a neat idea, but standard 2-pane patio doors still let through the hum of the servers. I'm sensitive to noise, so sometimes I wish I'd put a nice thick door there instead, although it is nice to be able to see inside the server room real quick. If noise is important to you, I'd advise testing doors yourself and/or getting a standard thick door, maybe with a window, and lots of weather stripping. Noise insulation is a great idea and we've got none of it. - The person sitting at bottom-right tends to back up into the person sitting at the bottom, and is boxed in by the legs of the desks.
A diagonal corner desk might have been better. We only had 4 employees for a couple months. The left-side desk is adjustable height and we've got a tall task chair, which is useful if you want to convert it to a workbench. Then again desks work fine too.
Visio doesn't have specific features for wire length that I know of. For the current switch configuration you see in my diagram, I'm using 2U Neat-Patch cable organizers, which have a pocket in them to hold excess cable, and the 24' cables Neat Patch recommends (and supplies.) Attached are photos of my (messy!) server room and office so you can see in detail the plan and the reality.
I didn't post the Visio plan of the server racks or the telephone wiring in the earlier thread, but you can see that the server racks and telecom wiring have a lot of things that needed planning (such as wire lengths, logical organization and routing, heat, placement, weight, and accessibility). - If you don't have a wire routing pattern that's both inclusive enough for all situations and strict enough to standardize paths, you'll end up with a rat's nest. For example, my telephone wiring has PBX stations on top and premise wiring on the bottom. Each port and jack is labeled, and there's extensive wiki documentation and diagrams for this. Notice how there are no stray wires on the bottom half of my phone wiring panel, except for the two surface-mount jacks that I added hastily. - Other parts to notice are the network PDU on the side, the two huge UPSes on the bottom, and the KVMs, which are behind the Cisco PIX and go to each server. There is a long KVM cable going from each KVM to the master KVM on the telco rack, which in turn goes to the tiny IP KVM unit at the very top and the monitor/keyboard/mouse.
Also notice the brown boxes of unused Panduit vertical cable organizers. It turned out I didn't use them, partly because 4-post racks aren't friendly to them, and partly because there were fewer cables than I imagined; then again, it's a rat's nest back there, so I probably should use some sort of cable management. One final note before I move on to your questions in this thread: I'm not too keen on my dedicated A/C unit or all the wire management you see here (especially the vertical stuff). If I redesigned this whole thing, I'd probably look into a self-contained 4-post enclosure like they have at APC, which does its own cooling, power distribution, and wire management. For example, you might have a UPS, but what happens when someone flips the breaker on the A/C unit at 2 am? It's happened and it's not fun. Supposedly it's also more efficient to use self-contained rack cooling than large A/C units anyway. As always, just message me and I'll be happy to help.
1. Cooling: Constant active thermal management is necessary for the number of servers I have. I'd imagine you have a similar number of servers, but even half as many would still generate enough heat that I'd recommend a dedicated cooling unit. If that's out of your budget, though, talk to APC or CDW, or go with your gut.
Just do a lot of planning, testing, and research. 2. Raised floors: No need. Either run the cables overhead on a ladder rack as I've done, or tape the power cord to the floor/wall. With UPSes you should only have one or two power cords anyway, and with proper planning you shouldn't be cramped for space behind the racks. 3. Electrical outlets: the 30-amp plugs I have on my UPSes lock into the socket (see pic). If you can't do this, you can get plastic outlet covers that extend over the plug and should prevent it from getting knocked loose.
(They're UPSes anyway, right? Don't you have an email alert set on them?) 4. Cabling: Go with CAT6. It really shouldn't be much more expensive, and the instant an executive or marketing person wants faster Ethernet you'll wish you'd done it.
I wish I had. By the way, go with professionals for the wiring job. I chose Applied Business Communications here in Arizona (they've done Symantec data centers) and I can't praise their work enough.
Prior to their work, troubleshooting in-wall wiring was a normal chore. Now it's a thing of the past. The wires just work, end of story.
HUGE time saver. Worth every penny. 5. Grounding: Honestly, I get the reasoning behind grounding, and one of these days I'll probably get around to it, but my racks are bolted into concrete which is sitting on dirt.
If your racks were bolted into wood on the second story of a glass building, sure, grounding might be an issue. But even then, the chances of a problem with an improperly grounded rack (other than static electricity?) are much lower than the chances of a problem with cheap equipment elsewhere. (Notice I didn't say more severe, since apparently improper grounding is a cause of death.) IMO, don't spend thousands of dollars on it unless you think it's necessary. 6. Lighting: I ended up placing my fluorescent lights directly above the racks, and the walls are white, so visibility isn't a problem.
Those work lights with hooks and a long cord used in the auto industry might work well for your situation, or just a small fluorescent light mounted somewhere behind the rack. Without seeing the situation I can't really advise.
Will has a great point here. No matter how well you think you plan, you will find something that you forgot or that needs to get moved around. I think the biggest impact you will face is what you do not know you will need yet. In other words - plan for expansion.
I reworked a server room at a remote office this year. Quite frankly, I was surprised that we got everything that we did.
The building was rewired, phone system moved into new data racks, and I thought we left plenty of room for expansion. In fact, I caught some flack about how much space was left over. Everything looked picture perfect.
We had our cabling guy do all of the work, and then I went through and tested everything with him remotely. (Figure on a few hiccups, like the phone extension that was still on its own POTS line and directly wired. It's also amazing how many extra phone lines and such you uncover; they had been blindly paid for for years.) One great recommendation our cabling guy made was to use extra-long patch cables. The data racks had vertical and horizontal cable management, and you could tuck the extra patch cable into the vertical space. Since the install, some phones were moved around by the onsite guy, who is sort of techie and does what we need (our hands, if you will). Things got a little messy, and we had to get on his case about using the management properly.
In addition, we added a bunch of stuff we never saw coming: wireless access points across the building, fiber over to the warehouse next door for the wireless points there, new security camera systems, and an additional digital phone card which got punched down to a CAT-5 panel. So all that extra space we had: gone! I could even see another full 42U data rack being added in the future.
One other thing we did this year was standardize on the color scheme we used for wiring. This way when you are looking at a wire, you immediately know if it is phone or data, or a special cable. You can use your own system, but make sure it is in a common place, and have extra data and phone patch cables around so that people use the system. My cabling guy was giving me a hard time about it looking like a Christmas Tree until he saw me remotely diagnose a problem with a receptionist that could barely turn on a computer.
He uses the exact same color scheme now for all of his clients. Will seems to have most things covered. Think about air flow and which direction things will need to go. Is one rack full of servers exhausting into the intake area of another? Will the air conditioning system pull the heat toward the other racks? Work with your HVAC guy to plan air returns, etc.
Get a quality electrician for the wiring. You probably want a sub-panel in the server room if you can get it. Not only does it make turning off the power easier (without discovering that the VP's space heater got added to your server room circuit), it makes adding on easier in the future. Make sure you go over all of your APC plug types and power requirements with the electrician. If you are going to be expanding, now is a great time to upgrade your APCs to allow for expansion.
Look at the draw on your current APCs and calculate run time. Do you have remote management cards in your APCs? (WELL WORTH THE MONEY.) Get a real server rack with cable management. Not only will it make the setup cleaner, it will allow for proper airflow. You might even consider one with air conditioning or an exhaust system to duct the heat out of the room altogether. It is amazing the difference adding one or two new servers makes to heat levels.
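The "look at the draw and calculate run time" step above is a simple energy calculation. A back-of-the-envelope sketch follows; the battery capacity, load, and efficiency figures are made-up examples (real runtime curves are nonlinear), so treat this as a sanity check against your UPS's spec sheet, not a guarantee:

```python
# Rough UPS runtime estimate from battery energy and measured load.
# All numbers below are illustrative assumptions; check your own
# UPS spec sheet for battery watt-hours and inverter efficiency.

def estimated_runtime_minutes(battery_wh: float, load_watts: float,
                              inverter_efficiency: float = 0.9) -> float:
    """Approximate runtime in minutes at a constant load.

    usable energy (Wh) / load (W) gives hours; multiply by 60 for minutes.
    """
    usable_wh = battery_wh * inverter_efficiency
    return usable_wh / load_watts * 60.0

# Example: a hypothetical unit with 400 Wh of battery carrying a
# measured 600 W load -> 400 * 0.9 / 600 * 60 = 36 minutes, best case.
print(estimated_runtime_minutes(400, 600))  # 36.0
```

Even this optimistic estimate shows why the posters below say UPS runtime is for shutting down, not for riding out a multi-hour outage.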
Like Will said, plan the racks from bottom to top. Put the weight at the bottom.
I would bring the cabling in over the top with wireways (from the ceiling) or ladder racks (horizontal). As for the power under the floor: unless you are going to do the whole server area with a real raised floor, I would skip this. You also have height restrictions to consider here. Work with the electrician to place the plugs for the APCs in non-foot-traffic areas. Hiding the plug in the floor makes it inaccessible if you should ever have a fire; if you need to get to the plug in a hurry, you are screwed. On CAT-6: you will probably never buy full gigabit switches for the desktops, and then you would need to switch everything to 10 Gb on the server side anyway.
Use the CAT-5 for phones and data (like extension cords). Don't get cute splicing wires for phones or any crap like that; every jack, end to end, should be CAT-5 and be usable for both. Terminate to panels in the data rack and use cross connects to a switch in the server rack, a quality gigabit switch for all ports. I would grab a Cisco switch for the server rack. Consider using fiber or CAT-6 for cross connects to remote data closets. Not everything needs to be a home run to the server room.
Consider a 25-pair to patch panels for cross connects for digital phones in the remote racks. This way you have maybe a pull of several fiber strands, or a few CAT-6 and a 25-pair, to the closet. (Yes, build in redundancy.) When you are sitting there with a bad cross connect because someone cut a cable or put a screw through it, you will look stupid if, after all this money, you can't just move to a secondary circuit. I'll think of other stuff and post back. Here is a list of patch cable colors. It may seem crazy, but it is handy when you can say 'follow the RED wire and see if the light is on' to someone who has no clue, for example.
- PURPLE (CAT-6): server NIC cards, switch uplinks
- GREEN (CAT-5e): management cards (APC, DRAC, other devices)
- YELLOW (CAT-5e): data patch cables (workstations)
- BLUE (CAT-5e): phone patch cables, patch cross connects
- RED (CAT-5e): internet uplink (cable, FIOS, T1, etc.)
- BLACK (CAT-5e): wireless access points
- GRAY (CAT-5e): video (DVR, cameras)
1) Cooling: there is a nice cooling planner on Dell's site. It is primarily for Dell equipment, but if you add a USER DEFINED DEVICE you can manually type in the U height and how much power the equipment consumes. Once you have all the equipment in the rack, it will tell you how many watts, amps, and volts your equipment uses, and then how many tons of cooling you will need to cool the room.
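The conversion the Dell planner does under the hood (equipment watts to tons of cooling) is straightforward: nearly every watt a server draws ends up as heat, 1 W ≈ 3.412 BTU/hr, and 1 ton of cooling = 12,000 BTU/hr. A minimal sketch; the 20% safety factor and the example wattage are assumptions for illustration:

```python
# Convert total IT equipment power draw to required cooling capacity,
# using the standard conversions: 1 W = 3.412 BTU/hr, 1 ton = 12,000 BTU/hr.

BTU_PER_WATT_HR = 3.412
BTU_PER_TON = 12_000

def cooling_tons(total_equipment_watts: float,
                 safety_factor: float = 1.2) -> float:
    """Tons of cooling needed for a given IT load, with headroom.

    The 20% safety factor is an assumed allowance for lights, people,
    and heat coming through walls; size it for your own room.
    """
    btu_per_hr = total_equipment_watts * BTU_PER_WATT_HR
    return btu_per_hr * safety_factor / BTU_PER_TON

# Example: a hypothetical 7,500 W of racked equipment needs roughly
# 2.5-2.6 tons of cooling with headroom.
print(round(cooling_tons(7500), 2))  # 2.56
```

This is why a 3/4-ton portable unit falls hopelessly short once a room accumulates a few kilowatts of servers.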
Another of our sites was having issues with the server room being too hot. We had a portable 3/4-ton AC unit and found that the room needed 2.5 tons of cooling to keep it cool. Now that we have upgraded to a 3-ton external unit, the room has been fine. 4) CAT6: we rebuilt a new office building from scratch last year and found no reason to go with CAT6.
In that building we have around 75 users, 4 servers and are running a VoIP phone system for 3 locations from that building and have had no issues at all. Though this will depend on what your company does.
If you are a video editing company with high-end desktops then I would definitely go with CAT6 and gigabit switches. The site I was speaking of mainly uses thin clients to connect to terminal servers, so the load is never heavy.
All the processing is done on the servers. Brian has some great points here (not to pat each other on the back too much), but I want to add: - UPS run time sucks. For the money, you're looking at 5-15 minutes of runtime: long enough to shut down, not long enough to keep running. Look at the average power outages and durations over the past 5 years and ask management if it's worth $x for an appropriately-sized automatic-on generator to keep running during those times. My boss said yes, and we're not a big company. - I recommend CAT6 unless there's more than a 20% price difference.
It's higher quality cable, and if your boss decides he wants to copy a 10 GB PST file over the network with no lag, guess what: you can buy a gigabit switch, but CAT5 isn't going to carry the faster signal. Consider the cost of ripping out the wiring and/or adding more wire yourself later; with online bulk CAT6 prices so low, it's worth it IMO.
- Put at least 2 CAT5/6 runs at each data location, and 1 CAT5/6 at each phone location. (Phone only uses 1 pair out of 4, so although it's not pretty, it is easy to add unforeseen phone lines.) Running wire is cheap! The expensive part is the JOB itself and the LOCATION to drop to. This makes planning for expansion easy: 4 wires cost basically the same as 2. - Brian has the right idea with 10/100 switches for clients and gigabit backbone switches. Backbone and servers should have 10x the speed of clients unless usage is REALLY sparse.
- If Cisco equipment breaks your budget, look at 3com. If you double up on your switch cross-connects (like redundant lines to an intermediary wiring closet), make sure you appropriately configure link aggregation (bonding). Did you know you can take down a network by plugging two switches into each other twice? True story; it ruined my weekend. - Insist on home runs for all wires (i.e., no switches/splitters under desks) and only use an intermediary home run when necessary for wire length or balance issues.
Cross-connects add substantial overhead when making changes to wiring; they're not cute, so minimize complexity as much as possible. If I have a phone with extension 241, plugged into a jack named IV3, which goes to 25pairA #19, which goes to PBX port 03.16, you're talking about five separate wires to document and troubleshoot. Minimize this as much as possible, but without locking yourself into a situation where a 1:1 relationship between jacks and 25-pairs prevents you from adding more than 25 ports in a given area despite only having 12 active lines there. An easy way to do this is to eliminate the 25-pairs needed for voice intermediary closets.
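The chain described above (extension to jack to 25-pair to PBX port) is exactly the kind of thing worth keeping in a machine-readable map so you can trace it without crawling behind the rack. A toy sketch of such documentation; all identifiers here are the hypothetical examples from the post, not a real wiring database:

```python
# Toy wiring documentation: each entry maps a point in the signal chain
# to the next point downstream. Identifiers are hypothetical examples.

WIRE_MAP = {
    "ext 241": "jack IV3",
    "jack IV3": "25pairA #19",
    "25pairA #19": "PBX port 03.16",
}

def trace(start: str) -> list[str]:
    """Follow the wire map from a starting point until the chain ends."""
    path = [start]
    while path[-1] in WIRE_MAP:
        path.append(WIRE_MAP[path[-1]])
    return path

print(" -> ".join(trace("ext 241")))
# ext 241 -> jack IV3 -> 25pairA #19 -> PBX port 03.16
```

Even a flat text file in this shape beats undocumented cross-connects: when a line goes dead, you know every segment to test, in order.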
I don't recommend CAT5 patch cables for phones. Patch panels and cables are expensive, you're wasting 3 twisted pairs per jack, panels and cable management take up tons of space, and length is hard to change so you quickly end up with a rat's nest. Get a nice punchdown tool and punch your phones down to 110-blocks, unless that's too confusing for whoever will be troubleshooting the wiring, or you're REALLY anal about pretty wiring and willing to waste $1000 on it.
Color-coding is fine, except when it locks you in, as I mentioned above. The instant someone connects a server with a blue wire and a phone with a purple wire, your system is screwed. I prefer clean, clear labeling and documentation of panels and ports over trying to label the wires themselves. Chris, a lot of this is hard to explain/predict in advance, so you might just plan all this out in Visio or by hand and share your diagrams for peer review. We don't know what you do and don't know or have experience with, so rather than droning on about power outlets in the floors, we could be discussing the finer points of routing topologies instead. @Will Bradley: '2. Raised floors: No need.
Either run them overhead on a ladder rack as I've done, or tape the power cord to the floor/wall. With UPSes you should only have one or two power cords anyway, and with proper planning you shouldn't be cramped for space behind the racks.' Is it OK to run electrical wires next to data wires? I guess that has to do with the shielding on the data wires, but I'm not sure.
I'm planning to put 3' between the back of a rack and the wall. Right now I've got only 2' and it's a bit tight. A question and some more background. What is a 'cross connect'?
Right now my network topology is simple because that's all I've needed. T1 Router (passthrough mode) Sonicwall TZ190 (NAT, VPN, Firewall) 2 x Dell PowerConnect 3024.
From there I go either into a server or a patch panel. Each office has a few electrical outlets and data outlets. A data outlet has three ports (white, grey, black). White for phone (although I seem to be the only person to remember this). Grey is miscellaneous (phone/data). Black is for data. In my office and the marketing office I'll probably double up on the number of electrical and data outlets because we seem to run out.
On my patch panels I use color, but I haven't been very good at keeping that up, so I have a semi-organized rat's nest. I'll be fixing this in the next incarnation. :) One reason I haven't been great about keeping my patch cable colors consistent is that I ran out of the appropriate color and had to use another kind (for whatever it was that I was doing). I thought it might be a better idea to buy rolls of cable and make my own patch cables instead of buying a box of n cables of n length.
I will also be labeling my cables more consistently. My goal is to 'fix' all the disorganization I have now and set up systems to help keep things organized:
Better cable management (especially in the server rack), a better labeling method, and a better patch cable system. Nice layout, Chris. If the purpose of the raised 3-inch platform is to provide a pathway for power cables, then your drawing works. Personally, I would move the platform to the left side, draw power from the left, and run it along the bottom of the racks to feed the UPSes.
If we imagine that your drawing is oriented like a map with north pointing toward the top of the page, then I would run a ladder rail east-west over the racks and extend the rail to the left wall. So: data cables overhead and power cables underfoot. I would consider moving the data drop away from an exterior wall to the upper right-hand corner if it's easier to pull cables. But again, your drawing is fine as is. And here is a picture of my current room. Some things you can't see:
To the right of the rack is a small desk with a keyboard, monitor, and mouse. I've got two KVMs from Belkin. But I dislike these very much because they are not accessible over the internet.
In our other building I got an IP KVM from Raritan that I'll be using. I'm not totally thrilled with it so I'd be glad to hear suggestions. It is definitely better than the Belkin KVM though. There is a wall mount A/C to the right. It is directly above the desk.
It seems to work well enough, but I think it will be too small for the new room. This might be a silly question, but unless I'm wrong, battery backups serve two purposes: 1. power delivery regulation and 2. uptime for equipment.
I'm thinking that if the battery backups I've got only support my equipment for a few minutes, AND the powers that be decide we don't need to shell out for 30+ minutes of battery/generator time, is there any reason to have a real UPS at all? Would it be sensible to simply have some kind of power conditioner instead? If the power goes out and I'm sent an alert via email, the servers will probably already be off by the time I can remote in and shut them down properly. Is modern hardware susceptible to damage if power is cut to a server without warning? Cross-connect question: if you look at Will's left-side wall picture, on the right-side wall you will see the row of phone system equipment (black), below it the green Cisco router (probably a 1700 or 1800 series), and then the bank of white punch-down blocks. The gray wires come from another location and are terminated on the white 110 punch-down blocks. These are generally intermediate punch-down blocks (or cross-connects).
In the Axxess setups we have done, we come directly from the Amphenol connectors (the black connectors on the phone system that turn into gray cables) to CAT-5 patch panels. This makes it easier to test the phone connections, and it also makes the phones routable just by changing patch cables. In this case, Will's setup either goes from the intermediate punch-down blocks to panels in the data rack, or each jack is hard-wired to a jack in an office. The downside to hard wiring is that you have to program an extension that corresponds to the port on the digital phone card.
If someone moves across the building, most phone systems will require you to change the extension and clear their voicemail. With patch panels, I can move the phone without the user knowing the difference. All jacks from the offices terminate to patch panels.
All connections are CAT-5 end to end, so the jack in the office effectively becomes an extension cord. If the office jacks terminate in a closet and you then use cross-connects from the server room to the closet, you can route the phone across a 25-pair cable; or, if you have CAT-5 crossing one-to-one to the panels, you can just use a regular set of patch cables.
Then you are just using patch cables to connect from panel to panel. APCs - YES - always have APCs with a shutdown capability. If they do not want to shell out for the APCs now, they can shell out for replacement servers at some point. Our offices were a bit more of a mess yesterday, but since everyone left early today I got a bit of time to clean up. First row of pics: 'my' end of the IT office (our third man left shortly after we moved in); yes, that's a rack server on the desktop. The email server blew up at Thanksgiving, and yes, I was thankful I got a new shiny to play with. Second row: the 'I can't do it over the network' desk (hardware repairs, reimaging, etc.). Third row: our server room, fairly tiny, but the baker's rack will go as the older servers die. Fourth row: the IT Director's office, i.e. the file room.
@Scott Howard The data room, as I think the owner is planning it, will probably be centrally located in the office space. It will not be against any exterior walls. Based on your comments I made the platform even longer! I want it to span the entire space back there so that I'm not tripping over it. I imagine it will not be bolted down so if it's too small it's going to slide around. I did add a ladder as you suggested. Visio doesn't have that kind of object so I just made my own. And I didn't put all the rungs on it because, well, it was taking too long.
Do you think another ladder should be run from the left wall up to the data drop?
Chris asked, 'Do you think another ladder should be run from the left wall up to the data drop?' Yes, OR slide your racks almost all the way to the left wall.
Move the data drop south-ish so it lines up with your row of racks. The cables come down from the ceiling (or up through the floor) directly inside the leftmost rack and terminate on equipment mounted in that rack.
Also, in the lower left-hand corner of the room, you might want racks or shelves you can set up for multiple PCs in case you want to rebuild a PC, set up a test server, etc. Chris: it is NOT OK to run electrical wires (especially unfiltered by a UPS or high-quality line filter) within 1 foot of data wires. I plan them to run on opposite sides of the racks.
This helps when your servers are identical, so the PSUs are always on the same side. There is no shielding on normal CAT5 lines. See my Visio server room plan; it turned out just fine.
It's only tight when two people are trying to work back there. Your Visio plan: I don't get the 3' platform's purpose. Minimize obstructions as much as possible. Consider centralizing your rear power outlets and running cords overhead on ladder rack; maybe even raise the outlets to ceiling height. Which direction are the servers and equipment facing? Keep in mind that switches have their connections on the front (power on the back), while servers have all their connections on the back. Four-post racks have difficulty with standard wire management channels, so plan this very carefully. Try to locate your data drop closer to the racks, or run some ladder rack from the wall to the patch panels on the racks.
Do you have any wall mounted devices? If so, get a fire-rated backboard (they're sold online, don't just get plywood.) Is noise an issue? You're seated awfully close to a ton of equipment, I would need earplugs if I were you. Storage is probably insufficient, consider moving your whiteboard.
Can your door open outwards? Also, with new advances in virtualization and blade server systems, do you really need five racks? Consider all paradigms here- if I redesigned my system I'd probably go with a blade and a SAN and thus save hugely on heat, power, noise, remote management, and form factor issues.
Responding to Brian: the 110 cross-connects are, from top to bottom: premise wiring (MV01 - MV048); 50-pair cross-connect to the IDF; phone system cards 1-15. Do you mean my Dell PowerEdge 2600? It's the former 'does everything' server from the old system. All those old towers are now used for testing.
A couple Iomega NAS for archive files, but I hate them. My IBM servers, on the other hand, are rock stars.
I highly recommend SATA/SAS drives; SCSI is dead, we're down to $100 per terabyte, and clients can't access files at over 100 Mbps anyway. I like my 3Com switches. The new black gigabit one caused me some trouble and shook my confidence, but all the pro-grade SuperStacks are bulletproof.
We have had exactly zero problems with physical infrastructure in 2 years (that weren't caused by a dumbass human). No workstations are used as servers except noncritical ones like internal troubleshooting/testing/development stuff. As I said, those big boxes do 1/10th the work of my IBM rack. Re: what's the setup? You gotta be more specific; I could type for a week and not fully answer that question. ;) Chris again: 'cross-connect' is the technical term for a wire that is not used to connect device-to-jack, but rather jack-to-jack (maybe device-to-device too, but that's less accurate). Research the TIA/EIA-568-B standards document; it will help you greatly in planning and organizing your wiring. It has guidelines on bend radius, vertical vs. horizontal wiring and connections, color coding, etc.
There are a couple of summary documents like this that give you a great education on it, and I highly recommend becoming an expert at it. Don't bother making your own patch cables unless you're a master at it (like me ;) ).
Fry's Electronics and online vendors sell them for much less than 15 minutes of your salary. I just use all black, and label the JACKS, not the WIRES. Then document in the form 'Jack 14 - Switch2 Port 23'. Again, Google Docs or DokuWiki is a godsend for me. I can't recommend NeatPatch cable managers enough. My KVMs are Avocent, and they work fine, especially the newer ones. They sell IP KVMs of all shapes and sizes.
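The 'label the jacks, document the mapping' idea can live in something as simple as a script-friendly table. A minimal sketch (all jack and switch names here are made-up examples, not from anyone's actual room):

```python
# Hypothetical jack-to-port inventory: the label lives on the wall-plate
# jack, and this table records where its patch cable currently lands.
jack_map = {
    "Jack14": ("Switch2", 23),   # office jack -> (switch, port)
    "Jack15": ("Switch2", 24),
    "MV03":   ("Switch1", 7),
}

def locate(jack):
    """Return a human-readable 'Jack -> Switch Port' line for a jack label."""
    switch, port = jack_map[jack]
    return f"{jack} -> {switch} port {port}"

if __name__ == "__main__":
    print(locate("Jack14"))  # Jack14 -> Switch2 port 23
```

Dumping this table into a wiki page (DokuWiki, as suggested above) keeps the documentation where the whole team can find it.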
Switching is done with the PrintScreen button. Strongly consider rack-based cooling; it's much more efficient and flexible than wall-mount A/C and in some cases can end up saving money. APC makes and sells such units. Your UPS runtime requirement is 'how long does it take to (a) determine that it's a big enough outage to start shutting down, and (b) shut down everything completely?' Might sound like common sense, but re-read it: it takes a lot of planning, analysis, and estimation.
I highly recommend network management cards for all UPSes connected to servers; USB doesn't cut it. Re: Brian again: our voice wiring is all 110-block, no patch panels.
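One nice thing about network-reachable UPS monitoring is that it's easy to poll from a script. As a sketch, assuming an apcupsd instance reachable over the network (the host name is hypothetical), `apcaccess status` emits `KEY : VALUE` lines that are trivial to parse:

```python
import subprocess

def parse_apcaccess(text):
    """Parse the 'KEY : VALUE' lines emitted by apcupsd's `apcaccess status`."""
    status = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            status[key.strip()] = value.strip()
    return status

def on_battery(host="ups.example.local:3551"):
    # Hypothetical host; apcaccess queries apcupsd's network server (NIS).
    out = subprocess.run(["apcaccess", "status", host],
                         capture_output=True, text=True, check=True).stdout
    return parse_apcaccess(out).get("STATUS") != "ONLINE"
```

From there a cron job or monitoring system can alert, or trigger shutdowns, without anyone needing a USB cable plugged into each box.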
As I said earlier, in my opinion using CAT5 patch cables for voice wiring is a waste unless someone is really uncomfortable with a punch-down tool. Example path: Axxess port 13.01 - 110 block #13.01 - 110 block #MV03 - office wall jack MV03. I absolutely do not advise hard wiring. Use cross-connects (the method Brian and I are describing) to allow for easy wiring changes. Jen: what's the black tubing for? Is that exhaust for a contained cooling cabinet? Re: Chris again: I'm seriously not feeling the platform. Have an electrician extend your power outlet(s) to ceiling height and run a short ladder rack from the top wall to the racks.
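The Axxess-port-to-wall-jack path described above is just a chain of hops, which is why cross-connects make moves cheap: you re-point one hop instead of reprogramming the extension. A sketch of that idea (identifiers mirror the example path; the data structure itself is my own illustration):

```python
# Model the cross-connect chain as a list of hops from phone-system port
# to wall jack. Moving a user means re-pointing the cross-connect hops,
# not reprogramming the extension or clearing voicemail.
extensions = {
    "13.01": ["Axxess port 13.01", "110 block 13.01",
              "110 block MV03", "wall jack MV03"],
}

def move_extension(ext, new_jack):
    """Re-route an extension to a new wall jack by updating the last two hops."""
    path = extensions[ext]
    path[-2] = f"110 block {new_jack}"
    path[-1] = f"wall jack {new_jack}"
    return " -> ".join(path)
```

With hard wiring, by contrast, the phone-system side of this record is fixed, so every office move forces a change on the switch/card side instead.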
Here's where planning your wire routes comes in handy. For example one of the servers on my telco rack has a custom serial cable that connects to my phone system on the wall. I had to plan the placement carefully to make sure the cord would reach.
Don't buy 20 feet of ladder rack if 4 feet would be more effective. Simplicity is key.