Status:


Setting          | Value
-----------------+--------------------
Collected at     | 2015-09-19 08:22:27
Collected at GMT | 2015-09-19 08:22:27
Status           | HEALTH_WARN
PG count         | 2752
Pool count       | 13
Used             | 11.9 G
Avail            | 139.6 G
Data             | 17.5 M
Free %           | 92
Mon count        | 3

OSD:


Setting             | Value
--------------------+----------------
Count               | 5
PG per OSD          | 550
Cluster net         | 192.168.1.0/24
Public net          | 192.168.0.0/24
Near full ratio     | 0.85
Full ratio          | 0.95
Backfill full ratio | 0.85
Filesafe full ratio | 0.97
Journal aio         | true
Journal dio         | true
Filestorage sync    | 5s
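
The capacity figures in the status table can be cross-checked against the full-ratio settings above. A minimal sketch (the helper name and the 'ok'/'near-full'/'full' labels are illustrative, not Ceph's):

```python
def check_capacity(used_gb, avail_gb, near_full=0.85, full=0.95):
    """Classify the cluster fill level against the near-full/full ratios.

    Returns (free_percent, state) where state is 'ok', 'near-full' or 'full'.
    """
    total = used_gb + avail_gb
    used_frac = used_gb / total
    if used_frac >= full:
        state = "full"
    elif used_frac >= near_full:
        state = "near-full"
    else:
        state = "ok"
    return round(100 * avail_gb / total), state

# Values from the status table: Used 11.9 G, Avail 139.6 G
print(check_capacity(11.9, 139.6))  # (92, 'ok') -- matches "Free % 92"
```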

Activity:


Setting        | Value
---------------+------
Client IO Bps  | 0
Client IO IOPS | 0

Status messages:


Pool test has too few pgs
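
Ceph typically raises this class of warning when a pool's object count is far out of proportion to its PG count. For overall sizing, the commonly cited rule of thumb (not necessarily the exact heuristic behind this message) targets roughly 100 PGs per OSD, divided by the replica count and rounded up to a power of two; a sketch:

```python
import math

def recommended_pg_count(osd_count, pool_size, target_per_osd=100):
    """Classic rule of thumb: (OSDs * target per OSD) / replica count,
    rounded up to the next power of two."""
    raw = osd_count * target_per_osd / pool_size
    return 2 ** math.ceil(math.log2(raw))

# This cluster: 5 OSDs, size-2 pools -> 250 raw, 256 after rounding,
# which matches the PG/PGP counts of the larger pools in the report.
print(recommended_pg_count(5, 2))  # 256
```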

Host's info:


Name    | Services            | CPUs | RAM total | RAM free | Swap used | Load avg, 5m
--------+---------------------+------+-----------+----------+-----------+-------------
node-10 | osd-1, mon(node-10) | 1    | 2.9 Gi    | 244.3 Mi | 1.7 Gi    | 1.21
node-11 | osd-0, mon(node-11) | 1    | 2.9 Gi    | 251.8 Mi | 1.6 Gi    | 1.84
node-12 | osd-2, mon(node-12) | 1    | 2.9 Gi    | 238.7 Mi | 1.3 Gi    | 1.22
node-13 | osd-4               | 1    | 2.9 Gi    | 262.3 Mi | 252 Ki    | 0.24
node-14 | osd-3               | 1    | 2.9 Gi    | 279.2 Mi | 356 Ki    | 0.12

Monitors info:


Name    | Status    | Role | Disk free, B (%)
--------+-----------+------+-----------------
node-10 | HEALTH_OK | None | 5.4 G (36)
node-12 | HEALTH_OK | None | 5.5 G (37)
node-11 | HEALTH_OK | None | 5.5 G (37)

OSD's state:


Status | Count | IDs
-------+-------+----
up     | 5     |

OSD's info:


OSD | Node    | Status | Daemon run | Weight / reweight | PG count | Storage used | Storage free | Storage free % | Journal on same disk | Journal on SSD | Journal on file
----+---------+--------+------------+-------------------+----------+--------------+--------------+----------------+----------------------+----------------+----------------
0   | node-11 | up     | yes        | 0.030 / 1.000     | 1256     | 2.4 G        | 32.9 G       | 93             | yes                  | no             | yes
1   | node-10 | up     | yes        | 0.030 / 1.000     | 1224     | 2.4 G        | 33.0 G       | 93             | yes                  | no             | yes
2   | node-12 | up     | yes        | 0.030 / 1.000     | 1216     | 2.4 G        | 33.0 G       | 93             | yes                  | no             | yes
3   | node-14 | up     | yes        | 0.020 / 1.000     | 909      | 2.3 G        | 20.4 G       | 90             | yes                  | no             | yes
4   | node-13 | up     | yes        | 0.020 / 1.000     | 899      | 2.3 G        | 20.3 G       | 90             | yes                  | no             | yes
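
PG placement should roughly track effective weight (crush weight × reweight). A quick cross-check of the per-OSD PG counts above against the weights (the helper is illustrative, not part of the report tooling):

```python
def expected_pg_share(weights, total_pg_copies):
    """Split total PG copies across OSDs in proportion to
    effective weight (crush weight * reweight)."""
    total_w = sum(weights)
    return [total_pg_copies * w / total_w for w in weights]

# OSDs 0-2 weigh 0.030, OSDs 3-4 weigh 0.020 (reweight 1.000 everywhere);
# the per-OSD PG counts in the table sum to 5504.
weights = [0.030, 0.030, 0.030, 0.020, 0.020]
expected = [round(e) for e in expected_pg_share(weights, 5504)]
print(expected)  # [1270, 1270, 1270, 847, 847]
# Actual counts: [1256, 1224, 1216, 909, 899] -- close, but the two
# lighter OSDs carry ~7% more PGs than their weight alone implies.
```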

OSD's load uptime average:


OSD | Node    | Apply lat, ms | Commit lat, ms | D dev | D read Bps | D write Bps | D read OPS | D write OPS | D IO time % | J dev | J read Bps | J write Bps | J read OPS | J write OPS | J IO time %
----+---------+---------------+----------------+-------+------------+-------------+------------+-------------+-------------+-------+------------+-------------+------------+-------------+------------
0   | node-11 | 166           | 32             | sda   | 95.3 K     | 195.6 K     | 7.5        | 10.3        | 5           | sda   | 95.3 K     | 195.6 K     | 7.5        | 10.3        | 5
1   | node-10 | 121           | 120            | sda   | 105.0 K    | 207.5 K     | 7.8        | 11.1        | 4           | sda   | 105.0 K    | 207.5 K     | 7.8        | 11.1        | 4
2   | node-12 | 125           | 23             | sda   | 93.0 K     | 197.6 K     | 7.4        | 10.7        | 5           | sda   | 93.0 K     | 197.6 K     | 7.4        | 10.7        | 5
3   | node-14 | 14            | 4              | sda   | 19.5 K     | 11.0 K      | 4.9        | 1.2         | 0           | sda   | 19.5 K     | 11.0 K      | 4.9        | 1.2         | 0
4   | node-13 | 26            | 7              | sda   | 19.5 K     | 11.0 K      | 4.9        | 1.2         | 0           | sda   | 19.5 K     | 11.0 K      | 4.9        | 1.2         | 0

OSD's current load:


OSD | Node    | D dev | D read Bps | D write Bps | D read OPS | D write OPS | D lat, ms | D IO time % | J dev | J read Bps | J write Bps | J read OPS | J write OPS | J lat, ms | J IO time %
----+---------+-------+------------+-------------+------------+-------------+-----------+-------------+-------+------------+-------------+------------+-------------+-----------+------------
0   | node-11 | sda   | 111.5 K    | 335.5 K     | 10.4       | 14.4        | 9         | 6           | sda   | 111.5 K    | 335.5 K     | 10.4       | 14.4        | 9         | 6
1   | node-10 | sda   | 143.3 K    | 219.4 K     | 10.8       | 14.6        | 6         | 5           | sda   | 143.3 K    | 219.4 K     | 10.8       | 14.6        | 6         | 5
2   | node-12 | sda   | 68.4 K     | 185.6 K     | 7.9        | 12.4        | 10        | 6           | sda   | 68.4 K     | 185.6 K     | 7.9        | 12.4        | 10        | 6
3   | node-14 | sda   | 19.6 K     | 13.6 K      | 4.9        | 1.3         | 1         | 0           | sda   | 19.6 K     | 13.6 K      | 4.9        | 1.3         | 1         | 0
4   | node-13 | sda   | 19.6 K     | 13.0 K      | 4.9        | 1.4         | 6         | 0           | sda   | 19.6 K     | 13.0 K      | 4.9        | 1.4         | 6         | 0

Pool's stats:


Pool         | Id | size | min_size | obj    | data  | free | read    | write   | ruleset | PG  | PGP | PG per OSD dev %
-------------+----+------+----------+--------+-------+------+---------+---------+---------+-----+-----+-----------------
data         | 0  | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 64  | 64  | 13
metadata     | 1  | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 64  | 64  | 21
rbd          | 2  | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 64  | 64  | 23
images       | 3  | 2    | 1        | 51     | 2.7 M | ---  | 691     | 3       | 0       | 256 | 256 | 12
volumes      | 4  | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 256 | 256 | 23
backups      | 5  | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 256 | 256 | 11
.rgw.root    | 6  | 2    | 1        | 3      | 840   | ---  | 24      | 5       | 0       | 256 | 256 | 15
.rgw.control | 7  | 2    | 1        | 8      | 0     | ---  | 0       | 0       | 0       | 256 | 256 | 17
.rgw         | 8  | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 256 | 256 | 10
.rgw.gc      | 9  | 2    | 1        | 32     | 0     | ---  | 152.3 K | 101.5 K | 0       | 256 | 256 | 21
.users.uid   | 10 | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 256 | 256 | 21
compute      | 11 | 2    | 1        | 0      | 0     | ---  | 0       | 0       | 0       | 256 | 256 | 14
test         | 12 | 2    | 1        | 79.4 K | 4.8 M | ---  | 223     | 6.5 K   | 0       | 256 | 256 | 12

PG's status:


Status | Count | %
-------+-------+-------
any    | 2752  | 100.00
active | 2752  | 100.00
clean  | 2752  | 100.00

PG copy per OSD:


OSD/pool | .rgw | .rgw.control | .rgw.gc | .rgw.root | .users.uid | backups | compute | data | images | metadata | rbd | test | volumes | sum
---------+------+--------------+---------+-----------+------------+---------+---------+------+--------+----------+-----+------+---------+-----
0        | 105  | 107          | 130     | 118       | 127        | 115     | 121     | 29   | 121    | 24       | 22  | 121  | 116     | 1256
1        | 114  | 127          | 116     | 105       | 112        | 109     | 104     | 26   | 112    | 34       | 37  | 101  | 127     | 1224
2        | 106  | 113          | 112     | 119       | 121        | 111     | 116     | 29   | 102    | 27       | 26  | 112  | 122     | 1216
3        | 105  | 85           | 72      | 75        | 72         | 93      | 86      | 24   | 88     | 26       | 20  | 87   | 76      | 909
4        | 82   | 80           | 82      | 95        | 80         | 84      | 85      | 20   | 89     | 17       | 23  | 91   | 71      | 899
sum      | 512  | 512          | 512     | 512       | 512        | 512     | 512     | 128  | 512    | 128      | 128 | 512  | 512     | 5504
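
The "PG per OSD dev %" column in the pool table appears to be the relative standard deviation of each pool's per-OSD copy counts from the matrix above (population standard deviation over the mean, truncated to an integer); a sketch reproducing it:

```python
from statistics import mean, pstdev

def pg_dev_percent(copies_per_osd):
    """Relative standard deviation of a pool's PG copies across OSDs,
    as an integer percentage (population std / mean)."""
    return int(100 * pstdev(copies_per_osd) / mean(copies_per_osd))

# Columns taken from the "PG copy per OSD" table:
print(pg_dev_percent([29, 26, 29, 24, 20]))     # data pool -> 13
print(pg_dev_percent([121, 101, 112, 87, 91]))  # test pool -> 12
```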

Current disk IO load:


Host    | Dev | IOPS | Read IOPS | Write IOPS | Bps     | Read Bps | Write Bps | Latency, ms | Avg QD | Active time %
--------+-----+------+-----------+------------+---------+----------+-----------+-------------+--------+--------------
node-10 | sda | 25.3 | 10.8      | 14.6       | 362.7 K | 143.3 K  | 219.4 K   | 6           | 0.2    | 5.0
node-11 | sda | 24.8 | 10.4      | 14.4       | 447.0 K | 111.5 K  | 335.5 K   | 9           | 0.2    | 6.0
node-12 | sda | 20.3 | 7.9       | 12.4       | 254.0 K | 68.4 K   | 185.6 K   | 10          | 0.2    | 6.0
node-13 | sda | 6.3  | 4.9       | 1.4        | 32.6 K  | 19.6 K   | 13.0 K    | 6           | 0.0    | 0.0
node-14 | sda | 6.2  | 4.9       | 1.3        | 33.2 K  | 19.6 K   | 13.6 K    | 1           | 0.0    | 0.0

Network load (to max dev throughput):


Send:

Host    | Cluster net, Bps | Public net, Bps | HW adapters, Bps
--------+------------------+-----------------+---------------------------
node-10 | 811.9            | 97.7 K          | eth0: 106.2 K, eth1: 1.4 K
node-11 | 795.2            | 62.1 K          | eth0: 79.2 K, eth1: 1.4 K
node-12 | 818.1            | 57.1 K          | eth0: 65.1 K, eth1: 1.3 K
node-13 | 864.3            | 1.7 K           | eth0: 2.7 K, -
node-14 | 799.7            | 4.1 K           | eth0: 5.1 K, -

Receive:

Host    | Cluster net, Bps | Public net, Bps | HW adapters, Bps
--------+------------------+-----------------+---------------------------
node-10 | 708.0            | 65.9 K          | eth0: 158.5 K, eth1: 3.1 K
node-11 | 694.3            | 66.1 K          | eth0: 186.3 K, eth1: 3.0 K
node-12 | 709.7            | 55.3 K          | eth0: 200.5 K, eth1: 3.1 K
node-13 | 753.5            | 1.8 K           | eth0: 260.3 K, -
node-14 | 695.2            | 4.1 K           | eth0: 258.0 K, -
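
The section title promises load relative to maximum device throughput, but the figures shown are absolute Bps. With the 1.2 Gi adapter capacity listed in the per-host table, the ratio is easy to derive (assuming "Gi" here means GiB/s, which the report does not state):

```python
def utilization_pct(rate_bps, max_bps):
    """Traffic rate as a percentage of adapter capacity."""
    return 100 * rate_bps / max_bps

# node-10 public-net send (97.7 KBps) vs an assumed 1.2 GiB/s adapter.
adapter = 1.2 * 2**30
print(f"{utilization_pct(97.7e3, adapter):.4f}%")  # 0.0076%
```

Even the busiest adapter here is essentially idle by this measure.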

Host's resource usage:


node-10:
  Cluster net: br-storage, 192.168.1.4
    uptime average send/recv: 1004.4 / 876.5 Bps, 7.8 / 7.7 Pps
    current send/recv:        811.9 / 708.0 Bps, 7.4 / 7.4 Pps
  Public net: br-mgmt, 192.168.0.5
    uptime average send/recv: 73.8 K / 66.7 K Bps, 472.3 / 494.8 Pps
    current send/recv:        97.7 K / 65.9 K Bps, 555.1 / 541.3 Pps
  Adapters: eth0 1.2 Gi, eth1 1.2 Gi, eth2 1.2 Gi, eth3 1.2 Gi

node-11:
  Cluster net: br-storage, 192.168.1.2
    uptime average send/recv: 1012.1 / 936.6 Bps, 7.8 / 7.8 Pps
    current send/recv:        795.2 / 694.3 Bps, 7.3 / 7.3 Pps
  Public net: br-mgmt, 192.168.0.7
    uptime average send/recv: 72.9 K / 57.8 K Bps, 407.6 / 466.3 Pps
    current send/recv:        62.1 K / 66.1 K Bps, 376.1 / 468.6 Pps
  Adapters: eth0 1.2 Gi, eth1 1.2 Gi, eth2 1.2 Gi, eth3 1.2 Gi

node-12:
  Cluster net: br-storage, 192.168.1.3
    uptime average send/recv: 1.0 K / 903.3 Bps, 7.8 / 7.8 Pps
    current send/recv:        818.1 / 709.7 Bps, 7.5 / 7.5 Pps
  Public net: br-mgmt, 192.168.0.6
    uptime average send/recv: 68.3 K / 61.1 K Bps, 471.3 / 467.6 Pps
    current send/recv:        57.1 K / 55.3 K Bps, 432.0 / 430.4 Pps
  Adapters: eth0 1.2 Gi, eth1 1.2 Gi, eth2 1.2 Gi, eth3 1.2 Gi

node-13:
  Cluster net: br-storage, 192.168.1.1
    uptime average send/recv: 975.2 / 863.0 Bps, 7.7 / 7.7 Pps
    current send/recv:        864.3 / 753.5 Bps, 7.9 / 7.9 Pps
  Public net: br-mgmt, 192.168.0.3
    uptime average send/recv: 1.7 K / 1.7 K Bps, 12.2 / 14.5 Pps
    current send/recv:        1.7 K / 1.8 K Bps, 12.6 / 15.0 Pps
  Adapters: eth0 1.2 Gi, eth1 1.2 Gi, eth2 1.2 Gi, eth3 1.2 Gi

node-14:
  Cluster net: br-storage, 192.168.1.5
    uptime average send/recv: 966.5 / 852.9 Bps, 7.7 / 7.7 Pps
    current send/recv:        799.7 / 695.2 Bps, 7.3 / 7.3 Pps
  Public net: br-mgmt, 192.168.0.4
    uptime average send/recv: 3.9 K / 3.9 K Bps, 24.8 / 28.6 Pps
    current send/recv:        4.1 K / 4.1 K Bps, 25.7 / 29.3 Pps
  Adapters: eth0 1.2 Gi, eth1 1.2 Gi, eth2 1.2 Gi, eth3 1.2 Gi

Crush weight:


PG's count: