Channel: Mellanox Interconnect Community: Message List

Re: Mellanox (old Voltaire) ISR9024D-M recover flash area

Yes, 1.0.5 is the most up-to-date firmware this product ever had. As for documents, as a service to this dear group I packaged a bunch of documents for you guys. Keep in mind that this product was EOL'd long...


Re: SX1036 as IB-to-IP gateway

Hi Ingvar, The maximum number of ports you can achieve when using split options is 64 (with a mix of 40GbE and 10GbE). If it is a gateway, you need to allocate some of the ports to run as FDR, while others run as...



Re: Windows to Linux IB connections

There shouldn't be any problem with IPoIB (CM or UD) between Windows and Linux. When it comes to native RDMA applications, I can't see any reason why this shouldn't work. On the other hand, I haven't seen...
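One quick sanity check on the Linux side (a minimal sketch; the interface name ib0 and the peer address are examples):

    cat /sys/class/net/ib0/mode   # prints "datagram" (UD) or "connected" (CM)
    ping 10.0.0.2                 # basic reachability test toward the Windows host's IPoIB address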


Re: Mellanox OFED 2.1 CentOS 6.x system-config-firewall-tui hangs

I am not familiar with this issue, but I looked around and I can see a Bugzilla entry that has a solution for this: https://bugzilla.redhat.com/show_bug.cgi?id=485903 From what I am reading around, it...


ConnectX-3 adapter - Dell 8024F - RDMA/RoCE/SMB3

I have a few questions about the ConnectX-3 mezzanine adapter and the PCIe card. Assumptions: we are building a Hyper-V cluster with a 2012 R2 Scale-Out File Server and want to leverage SMB3/RDMA/RoCE. 1) Is the...



Re: Firmware for Mellanox cards in Intel S2600JFQ

Hi Sergey, It is a bit odd that the firmware for the HCA PSID INCX-3I358C10551 was bumped up to 2.30.8000 and the FW for “501” wasn’t. If you have a support contract with Mellanox, then you should put...
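For anyone checking their own board, a minimal sketch of reading the PSID and current firmware with the Mellanox Firmware Tools (the /dev/mst device name below is an example; mst status lists the real ones):

    mst start                                # load the MST kernel modules
    mst status                               # list the /dev/mst/* devices
    flint -d /dev/mst/mt4099_pci_cr0 query   # shows the FW version and PSID of the card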


Re: Simultaneous TX/RX does not achieve 80Gbps (ConnectX-3)?

Hi yairi, I tried adding multiple threads and here are the results:
1. Two NICs (1 connection):
- iperf 1 thread: 23 Gbps
- iperf 2 threads: 39 Gbps
- iperf 4 threads: 39.6 Gbps
- iperf 8 threads: 39.6 Gbps
- iperf...
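For reference, a minimal sketch of such a multi-stream run, assuming iperf2 (the address is an example; -P sets the number of parallel client threads):

    iperf -s                           # receiver side
    iperf -c 192.168.10.1 -P 8 -t 30   # sender side: 8 parallel streams for 30 seconds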


Re: How to configure Mellanox card to accept breakout cable connection...

Thanks



create_ibv_flow() Create of QP flow ID failed

Dear Mellanox Support, I have problems with OFED 2.2. Neither raw_ethernet_bw / raw_ethernet_lat nor libvma can create a QP flow / call verbs. At some point that machine used to run fine, but then we...



Re: create_ibv_flow() Create of QP flow ID failed

Basically, the high_rate_steer=1 parameter breaks OFED; that was the reason.
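A hedged sketch of undoing that setting, assuming it was placed in a file under /etc/modprobe.d/ (adjust to wherever the option was actually set):

    grep -r high_rate_steer /etc/modprobe.d/   # find where the option is set
    # remove "high_rate_steer=1" from the "options mlx4_core ..." line, then:
    /etc/init.d/openibd restart                # reload the driver stack so the change takes effect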


Mixing OFED 1.5.3 and 2.2 in the same network?

Hi all, We have 2 clusters using InfiniBand, as follows:
Computing cluster 1:
- 27 nodes
- IBM BladeCenter
- CentOS 6.2
- Each node has 1x MT26428 [ConnectX VPI PCIe 2.0 5GT/s - IB QDR / 10GigE]
- Firmware version:...


Re: Firmware for Mellanox cards in Intel S2600JFQ

Hi Branko, Thanks for the answer. I think I only got the support contract for the switch from Mellanox. The cards were provided as part of the Intel system, so I don't think I have specific Mellanox...


Performance of 40GbE NICs

Hi, I have two machines, each with 4x 40GbE NICs:
# lspci | grep -i mellanox
02:00.0 Ethernet controller: Mellanox Technologies MT27500 Family [ConnectX-3]
07:00.0 Ethernet controller: Mellanox...



IPoIB iSCSI disconnects. CQE retry exceeded

Hi guys, The new environment we are running is having some issues; the following is the architecture:
Compute - XenServer 6.2 (HP DL360 G7s)
Storage - OmniOS Comstar Target
Voltaire 4036...


Re: IPoIB iSCSI disconnects. CQE retry exceeded

If you don't have any limitations on which driver you can use, try installing the latest MLNX OFED (see link below). There have been a number of significant improvements between the 1.5.x and 2.x OFED...
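To confirm which stack is currently installed before upgrading (ofed_info ships with MLNX OFED; the installer step assumes you have downloaded and unpacked the new release):

    ofed_info -s          # prints the installed MLNX OFED version string
    # from inside the unpacked MLNX OFED bundle:
    ./mlnxofedinstall     # installs the new stack in place of the old one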



Re: Performance of 40GbE NICs

I'd suggest checking your system settings against the recommended settings for BIOS (CPU power management states, PCIe bus), interrupt moderation, and NUMA tuning according to the guide...
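A short sketch of the usual first checks (the interface name, address, and core list are examples; the full recipe is in the Mellanox performance tuning guide):

    cat /sys/class/net/eth2/device/numa_node    # which NUMA node the NIC is attached to
    ethtool -C eth2 adaptive-rx off rx-usecs 0  # disable adaptive interrupt moderation while testing
    taskset -c 0-7 iperf -c 192.168.10.1 -P 4   # pin the benchmark to cores local to that NUMA node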


Re: Simultaneous TX/RX does not achieve 80Gbps (ConnectX-3)?

The theoretical max numbers in your case will be limited by the following: 1. PCIe limit: PCIe 3.0 is 8 GT/s per lane, but effectively it delivers about 1.54% less data due to 128/130 encoding, so...
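As a worked version of that encoding overhead (assuming the PCIe 3.0 x8 slot that ConnectX-3 cards use):

    8 GT/s x 8 lanes x (128/130) ≈ 63.0 Gb/s of raw PCIe bandwidth per direction, before TLP and protocol overhead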



Re: Mellanox MHRH19-XTC controllers, Windows Server 2012 R2

I left the question unresolved because I urgently needed to bring the cluster up. Once again I had to go back to Windows Server 2008 R2.


Re: Mellanox MHRH19-XTC controllers, Windows Server 2012 R2

But here, too, not everything is right: I cannot configure the card for maximum performance, since I lack experience in this area (I'm a newbie). I would be very happy if anyone could help; it is unclear which drivers...


Re: Mellanox MHRH19-XTC controllers, Windows Server 2012 R2

When I run the MPI Ping-Pong: Latency test, I get the following warning:
MPI Ping-Pong: Latency
Test Result: Success
Failed nodes list...
