Enabling motherboard HDMI shouldn’t be difficult. Virtually all modern motherboards come with an HDMI output port on the rear I/O panel, so you will rarely have trouble finding it. However, some central processing units (CPUs) don’t have integrated graphics, and if your CPU does have them but you still see nothing on your screen, the motherboard HDMI port may simply be disabled. When that happens, what do you do? There is no need to panic.
Whether you want to use the motherboard HDMI port for a primary or a secondary display, the process is the same. It works even if your goal is to add an extra display and your graphics card only has a single HDMI port, so you don’t have to spend money on an adapter, especially if you’d rather not. To help you enable motherboard HDMI, here are the steps to get the port working.
How to Check Your CPU for Integrated Graphics
If you have checked and your motherboard does come with an HDMI port, the next step is to check whether your CPU can actually drive it. Without integrated graphics, the port has nothing feeding it a signal, so you can’t use it. If you know your CPU model, you can visit the manufacturer’s website for its specifications. If you don’t know how to find your CPU model, here is a simple way to check it in Windows 10.
- Click Start and open Settings. Alternatively, you can go through the Control Panel.
- Select System, then click “About” on the left-hand side.
- You will see the processor model listed under “Device specifications.”
- Once you have the model, search online using the model number and the processor manufacturer to find out whether it includes integrated graphics.
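If you prefer the command line, the steps above can be sketched with Python’s standard `platform` module. This is a minimal, hypothetical sketch: the exact string returned varies by system, and on some machines `platform.processor()` is empty, so the sketch falls back to the architecture name.

```python
# Minimal sketch: print the CPU model string so you can look it up
# on the manufacturer's website. Standard library only; output format
# varies by operating system.
import platform

# platform.processor() can be empty on some systems; fall back to the
# machine architecture so we always print something useful.
cpu_model = platform.processor() or platform.machine()
print(f"CPU model: {cpu_model}")
```

Running this from a terminal saves a trip through Settings, but either route gives you the same model string to search for.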
However, if you are using an older version of Windows (Windows 7 or earlier), you can use Device Manager to check your graphics hardware instead. Here is how to do that.
- Click Start on your computer.
- Type “Device Manager” and open it. As with Windows 10, you can also reach it through the Control Panel.
- In Device Manager, expand the “Display adapters” section.
- You will see the integrated or dedicated graphics card you are using. Sometimes you will see both.
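The same information Device Manager shows can be queried from a script. The sketch below is a hedged, Windows-only example using the `wmic` command-line tool and its `Win32_VideoController` class; the helper name `list_gpus` is my own, and on non-Windows systems it simply returns an empty list.

```python
# Hedged sketch (Windows-only): list installed graphics adapters,
# mirroring what Device Manager shows under "Display adapters".
import platform
import subprocess

def list_gpus():
    """Return adapter names on Windows, or an empty list elsewhere."""
    if platform.system() != "Windows":
        return []  # wmic is only available on Windows
    out = subprocess.run(
        ["wmic", "path", "win32_VideoController", "get", "name"],
        capture_output=True, text=True, check=False,
    ).stdout
    # Skip the "Name" header row and blank lines.
    return [line.strip() for line in out.splitlines()[1:] if line.strip()]

print(list_gpus())
```

If the list shows only a dedicated card (for example, an NVIDIA or AMD model) and no Intel/AMD integrated adapter, the onboard graphics are likely disabled, which is exactly the situation the next section addresses.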
If you don’t see an integrated graphics adapter, don’t panic. If you are using a dedicated graphics card, the motherboard may have disabled the onboard graphics automatically. If that is your situation, the process below will help you enable the port.
How to Enable Integrated Graphics in BIOS
Once you have confirmed that your CPU has onboard graphics, you can enable them in the BIOS. Some motherboard manufacturers have built-in support for using onboard and dedicated graphics simultaneously.
Asus, for instance, has such a feature: on an Asus motherboard you will find a setting called “iGPU Multi-Monitor,” and you should make sure it is enabled. Other manufacturers use terms such as “Enable iGPU,” “Integrated Graphics,” or “Enable Integrated Graphics.”
If you are new to computers and don’t know how to open your BIOS, we will guide you. Follow the process listed here carefully to avoid any problems.
- If your system is switched on, save any open documents and shut it down.
- Once the system is off, you can start the process. If you are on a laptop, make sure the battery is charged or the charger is plugged in.
- Turn the system on; a POST screen appears briefly before the system boots. You need to enter the BIOS setup before it boots into Windows. The key to press depends on your motherboard; common keys are F1, F2, F8, F12, and Delete. You can check your manufacturer’s website for the exact key.
- Once you are in the BIOS setup, look for the integrated graphics setting. Be cautious here, because anything you modify can affect how your computer behaves. If you don’t see the setting in any of the main menus, look under “Advanced Settings” for something like “Graphics Configuration” or similar.
- When you find it, enable the option labeled “Integrated Graphics,” “Multi-Monitor,” or “iGPU,” depending on your motherboard. The setting’s description should state that enabling it lets you use discrete and integrated graphics for multiple displays.
- When you are done, save your changes and exit the BIOS. Your computer will then reboot.
With integrated graphics enabled, you can now connect your new display. If you are unfamiliar with computers, follow the steps below to connect it.
How to Connect a New Display
Once your CPU’s integrated graphics are confirmed and enabled, the next step is to use the HDMI port. You will need an HDMI-to-HDMI cable to link your computer to the display. If you don’t have such a cable, you can buy one from an electronics store for a few bucks. Alternatively, if your display doesn’t have an HDMI port, you can use an adapter.
- Plug the HDMI cable into your display monitor.
- If you are using two monitors, plug the primary monitor into the graphics card, then plug the second monitor into the motherboard HDMI port. The motherboard ports are located on the rear of your system.
- Once everything is connected, turn your computer on and enjoy your multi-monitor setup. If the resolution looks wrong, you can tweak it to whatever suits you.
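To confirm that Windows actually sees the second monitor after you plug it in, you can ask the operating system directly. This is a hedged, Windows-only sketch using the Win32 `GetSystemMetrics` call with the `SM_CMONITORS` index; the helper name `monitor_count` is my own, and on non-Windows systems it just returns 0.

```python
# Hedged sketch (Windows-only): GetSystemMetrics(SM_CMONITORS) reports
# how many display monitors Windows currently detects -- a quick way to
# confirm the screen on the motherboard HDMI port is recognized.
import platform

def monitor_count():
    """Number of monitors Windows sees; 0 on non-Windows systems."""
    if platform.system() != "Windows":
        return 0
    import ctypes
    SM_CMONITORS = 80  # system metric index for the monitor count
    return ctypes.windll.user32.GetSystemMetrics(SM_CMONITORS)

print(monitor_count())
```

If this prints 2 on a dual-monitor setup, both ports are working; if it prints 1, recheck the BIOS setting and the cable on the motherboard port.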
Enabling the motherboard HDMI port is quite simple. In most situations, all you need to do is enable integrated graphics in the BIOS. If it still doesn’t work after that, consider updating your integrated graphics driver.
As an Amazon Associate I earn from qualifying purchases.