Errors on new SAN disks (fabric login port failure)

807557 — Apr 1 2006, edited Apr 14 2006
I posted something to the storage forum yesterday morning right before
the switch over to the new format and I think my post was lost. I
apologize if it later reappears and there are two posts of the same
subject. In any case, here's my question again...

We ran into an odd problem getting our machine to see some of our SAN
disks. In the example below, we restricted it to a single disk. We
have an x4200 running Solaris 10 x64 with Sun (QLogic) Fibre Channel
cards (SG-XPCI1FC-QLC) connected through a pair of Sun-sold QLogic
5200 switches to a couple of StorCase JBODs with a bunch of Seagate
disks (ST3300007FC Rev3). We have the Sun drivers:
root 16: modinfo | grep FC
111 fffffffff04d2000  19918  58   1  fp (SunFC Port v20051108-1.68)
113 fffffffff04ee000  18770  61   1  fcp (SunFC FCP v20051108-1.93)
115 fffffffff0509000   9d80   -   1  fctl (SunFC Transport v20051108-1.50)
116 fffffffff0512000  c9c48 119   1  qlc (SunFC Qlogic FCA v20051013-2.08)
164 fffffffff06bb000   9670  59   1  fcip (SunFC FCIP v20051108-1.43)
165 fffffffffbbb9ff0   5610  62   1  fcsm (Sun FC SAN Management v20051108)
182 fffffffff0719000   4b90   -   1  zmod (RFC 1950 decompression routines)
And a relatively recent version of the qlc driver:
root 17: showrev -p | grep 119131
Patch: 119131-14 Obsoletes: 119087-05 Requires:  Incompatibles:  Packages: SUNWfctl, SUNWfcip, SUNWfcmdb, SUNWfcp, SUNWfcsm, SUNWqlc
We've zoned the two switches so that the x4200 can see the drives. We
noticed that when we turned on the zoning for the disks, the following
messages appeared in /var/adm/messages for each disk:
Mar 31 20:41:45 pemsdc fctl: [ID 517869 kern.warning] WARNING: fp(0)::N_x Port with D_ID=202d2, PWWN=21000014c34f774b reappeared in fabric
Mar 31 20:41:45 pemsdc qlc: [ID 308975 kern.warning] WARNING: qlc(0): login fabric port failed D_ID=202d2h, error=4009h
Mar 31 20:41:45 pemsdc fp: [ID 517869 kern.info] NOTICE: fp(0): PLOGI to 202d2 failed state=Packet Transport error, reason=No Connection
Mar 31 20:41:45 pemsdc fctl: [ID 517869 kern.warning] WARNING: fp(0)::PLOGI to 202d2 failed. state=e reason=5.
Mar 31 20:41:45 pemsdc scsi: [ID 243001 kern.warning] WARNING: /pci@0,0/pci1022,7450@2/pci1077,132@1/fp@0,0 (fcp0):
Mar 31 20:41:45 pemsdc   PLOGI to D_ID=0x202d2 failed: State:Packet Transport error, Reason:No Connection. Giving up
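(For anyone suggesting diagnostics: as I understand it, the PLOGI
above is the Fibre Channel N_Port login that fp attempts against each
zoned device before any SCSI traffic can flow. We can dump the fabric
map the HBA port actually sees with luxadm; the :devctl path below is
taken from the fcp warning above, so adjust it for your own system:)

# show the state of each FC adapter port on the host
luxadm -e port

# dump the D_IDs and port WWNs visible through the failing port
luxadm -e dump_map /devices/pci@0,0/pci1022,7450@2/pci1077,132@1/fp@0,0:devctl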
Nevertheless, we went ahead and tried to configure the attachment
points with the cfgadm command, and we saw the following:
root 17: cfgadm -c configure c6
cfgadm: Library error: report LUNs failed: 21000014c34f774b
failed to configure ANY device on FCA port

root 18: cfgadm -c configure c7
cfgadm: Library error: report LUNs failed: 22000014c34f774b
failed to configure ANY device on FCA port
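(My rough understanding, which someone can correct: the 'report LUNs
failed' text refers to the SCSI REPORT LUNS command that cfgadm sends
to the target after the port login. If your Solaris 10 build has
fcinfo, the same probe can be driven by hand; the WWN below is a
placeholder, substitute the local HBA port WWN that fcinfo hba-port
prints:)

# list the local HBA ports and their port WWNs
fcinfo hba-port

# probe the remote ports on that HBA port, including SCSI LUN information
fcinfo remote-port -s -p <local-hba-port-wwn>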
If we have more disks zoned in, then configuration succeeds for some
disks and fails for others (we picked a case where it fails). The
disks that succeed don't produce the error messages in
/var/adm/messages. At this point we can see the disks in the
unconfigured state with cfgadm:
root 20: cfgadm -al c6 c7
Ap_Id                          Type         Receptacle   Occupant     Condition
c6                             fc-fabric    connected    unconfigured unknown
c6::21000014c34f774b           unavailable  connected    unconfigured failed
c7                             fc-fabric    connected    unconfigured unknown
c7::22000014c34f774b           unavailable  connected    unconfigured failed
But they obviously aren't usable, so format, luxadm, vxdisk, etc.
can't see them. Has anybody seen this before? What is the 'login
fabric port' step doing? And what does the 'report LUNs failed'
response from cfgadm mean? Is it the switch that's not allowing the
LUNs to be reported, or is the disk not reporting its LUNs correctly?
Thanks.

Karl
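
P.S. In case anyone wants to suggest a retry: a single failed
attachment point can be reconfigured directly, rather than redriving
the whole controller, with commands along these lines (a sketch only;
the WWN is the one from our cfgadm output above):

# retry just the failing attachment point
cfgadm -c configure c6::21000014c34f774b

# if it succeeds, rebuild stale /dev links and check visibility
devfsadm -Cv
echo | format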