Much of UITableView's power comes from the freedom to customize UITableViewCell. Cells in a UITableView are normally dynamic: a reuse pool of cells is maintained, and the number of cells visible on screen is determined by each cell's height (the value returned from tableView:heightForRowAtIndexPath:) together with the screen height. Custom cells can be built either in code or with a nib edited in Interface Builder; this article collects code-based techniques for customizing cells.
How to adjust the cell height dynamically
- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
static NSString *CellIdentifier = @"Cell";
UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:CellIdentifier];
if (cell == nil) {
cell = [[[UITableViewCell alloc] initWithFrame:CGRectZero reuseIdentifier:CellIdentifier] autorelease];
UILabel *label = [[UILabel alloc] initWithFrame:CGRectZero];
label.tag = 1;
label.lineBreakMode = UILineBreakModeWordWrap;
label.highlightedTextColor = [UIColor whiteColor];
label.numberOfLines = 0;
label.opaque = NO; // opaque = YES would tell the system that nothing behind the view needs to be drawn; set NO here so the clear background shows through
label.backgroundColor = [UIColor clearColor];
[cell.contentView addSubview:label];
[label release];
}
UILabel *label = (UILabel *)[cell viewWithTag:1];
NSString *text;
text = [textArray objectAtIndex:indexPath.row];
CGRect cellFrame = [cell frame];
cellFrame.origin = CGPointMake(0, 0);
label.text = text;
CGRect rect = CGRectInset(cellFrame, 2, 2);
label.frame = rect;
[label sizeToFit];
if (label.frame.size.height > 46) {
cellFrame.size.height = 50 + label.frame.size.height - 46;
}
else {
cellFrame.size.height = 50;
}
[cell setFrame:cellFrame];
return cell;
}
- (CGFloat)tableView:(UITableView *)tableView heightForRowAtIndexPath:(NSIndexPath *)indexPath
{
UITableViewCell *cell = [self tableView:tableView cellForRowAtIndexPath:indexPath];
return cell.frame.size.height;
}
How to use an image as a custom table separator
Normally, a statement like [tableView setSeparatorColor:[UIColor redColor]]; is enough to change the color of the separator line between cells. But how do you use an image as the separator background? Two approaches can be tried:
Method 1:
Set the cell's separatorColor to clearColor, then add the image-based separator line as a subview of your custom cell.
Method 2:
Add a one-pixel-high UIImageView to the cell and load the image into it, then set tableView.separatorStyle = UITableViewCellSeparatorStyleNone.
Customizing the spacing between the first cell and the navigation bar above it
tableView.tableHeaderView = [[[UIView alloc] initWithFrame:CGRectMake(0,0,5,20)] autorelease];
Customizing the accessory style of UITableViewCell
The default accessoryType property has four possible values: UITableViewCellAccessoryNone, UITableViewCellAccessoryDisclosureIndicator, UITableViewCellAccessoryDetailDisclosureButton, and UITableViewCellAccessoryCheckmark. To use any other style for the accessory button, set the cell's accessoryView property instead.
UIButton *button;
if(isEditableOrNot) {
UIImage *image = [UIImage imageNamed:@"delete.png"];
button = [UIButton buttonWithType:UIButtonTypeCustom];
CGRect frame = CGRectMake(0.0,0.0,image.size.width,image.size.height);
button.frame = frame;
[button setBackgroundImage:image forState:UIControlStateNormal];
button.backgroundColor = [UIColor clearColor];
cell.accessoryView = button;
}else{
button = [UIButton buttonWithType:UIButtonTypeCustom];
button.backgroundColor = [UIColor clearColor];
cell.accessoryView = button;
}
The code above only defines the accessory button's appearance in the two states; the problem is that the custom accessory button's tap event still does not work, i.e. it never reaches the UITableViewDelegate method tableView:accessoryButtonTappedForRowWithIndexPath:. When we add the following line to the code above:
[button addTarget:self action:@selector(btnClicked:event:) forControlEvents:UIControlEventTouchUpInside];
we can catch the tap on every accessory button, but we still cannot tell which row's button was tapped. The action selector registered with addTarget: receives at most two parameters, the sender and the event, and both already have their own roles (the sender is the control, the event describes the touch). Cocoa alone will not hand us the row index.
We can still make use of the event parameter, however, and work out inside our custom btnClicked:event: method which cell of the UITableView the touch landed on. UITableView has a key method, indexPathForRowAtPoint:, which returns the indexPath of the row at a given touch location, and the event object conveniently gives us each touch's position in the view.
// Work out where the user tapped and forward the event to the corresponding accessory-tapped delegate method
- (void)btnClicked:(id)sender event:(id)event
{
NSSet *touches = [event allTouches];
UITouch *touch = [touches anyObject];
CGPoint currentTouchPosition = [touch locationInView:self.tableView];
NSIndexPath *indexPath = [self.tableView indexPathForRowAtPoint:currentTouchPosition];
if(indexPath != nil)
{
[self tableView:self.tableView accessoryButtonTappedForRowWithIndexPath:indexPath];
}
}
This way the tableView:accessoryButtonTappedForRowWithIndexPath: method is triggered and receives an indexPath parameter, which lets us tell exactly which row's accessory button was tapped.
- (void)tableView:(UITableView *)tableView accessoryButtonTappedForRowWithIndexPath:(NSIndexPath *)indexPath
{
NSInteger row = indexPath.row;
// add your own handling logic here
}
Source: http://www.cnblogs.com/lovecode/archive/2012/01/07/2315630.html
The Qt image viewer described in an earlier post was written on Windows in Qt Creator, where it ran correctly. Since it also has to be built in the Qt environment on Linux, it was compiled there, and the result was not as expected: after opening an image on Linux, paging forward and backward did not follow the order of the files in the folder but some other fixed order, which in turn made deletion fail as well. The fix was to change the function that enumerates all the images in the opened image's folder to use a different traversal method, after which everything behaved as intended. The code is as follows:
csimagescan.cpp:
#include "csimagescan.h"
#include "ui_csimagescan.h"
csImageScan::csImageScan(QWidget *parent) :
QDialog(parent),
ui(new Ui::csImageScan)
{
ui->setupUi(this);
position =0;
left=0;
right=0;
num_Roll = 0;
setWindowFlags(Qt::FramelessWindowHint); // remove the title bar
int screen_W=QApplication::desktop()->width();
int screen_H=QApplication::desktop()->height();
resize(screen_W,screen_H);
//int win_W=width();
//int win_H=height();
//
printf("abcdefghigk=====%d,------12345678910========%d-----------\n\n\n\n",screen_W,screen_H);
/*
QPushButton pbexit = new QPushButton(this);
pbexit->setObjectName(QString::fromUtf8("pbexit"));
pbexit->setGeometry(QRect(803,728,40,40));
pbexit->setFocusPolicy(Qt::NoFocus);
pbexit->setFlat(true);
*/
ui->pre->setGeometry(523,728,40,40);
ui->next->setGeometry(603,728,40,40);
//ui->rotuteleft->setGeometry(203,728,40,40);
//ui->rotuteright->setGeometry(283,728,40,40);
ui->pbdelete->setGeometry(723,728,40,40);
ui->pbexit->setGeometry(QRect(803,728,40,40));
ui->rotuteleft->setEnabled(false);
ui->rotuteright->setEnabled(false);
QPixmap m_pixmap("res/pre.png");
QIcon m_icon;
m_icon.addPixmap(m_pixmap);
ui->pre->setIcon(m_icon);
ui->pre->setIconSize(QSize(40,40));
m_pixmap.load("res/next.png");
m_icon.addPixmap(m_pixmap);
ui->next->setIcon(m_icon);
ui->next->setIconSize(QSize(40,40));
/*
m_pixmap.load("res/left.png");
m_icon.addPixmap(m_pixmap);
ui->rotuteleft->setIcon(m_icon);
ui->rotuteleft->setIconSize(QSize(60,60));
m_pixmap.load("res/right.png");
m_icon.addPixmap(m_pixmap);
ui->rotuteright->setIcon(m_icon);
ui->rotuteright->setIconSize(QSize(60,60));
*/
m_pixmap.load("res/imgdelete.png");
m_icon.addPixmap(m_pixmap);
ui->pbdelete->setIcon(m_icon);
ui->pbdelete->setIconSize(QSize(40,40));
m_pixmap.load("res/exit.png");
m_icon.addPixmap(m_pixmap);
ui->pbexit->setIcon(m_icon);
ui->pbexit->setIconSize(QSize(40,40));
}
csImageScan::~csImageScan()
{
delete ui;
}
/*
void csImageScan::drawImageOnSrc(QString & str)
{
//
ui->label->setPixmap(QPixmap(str));
//
}
*/
// slot for the image-viewer dialog's exit button
void csImageScan::on_pbexit_clicked()
{
this->close();
}
// open an image for browsing
void csImageScan::on_open_clicked()
{
position =0;
stringlist.clear();
// printf("%s\n",filepath.toAscii().constData());
filepath = QFileDialog::getOpenFileName(NULL, QObject::tr("Open Image"), ".",
QObject::tr("Image Files(*.jpg *.png *.bmp)"));
filelist = filepath.split("/");
filelist.removeLast ();
folderpath = filelist.join("/");
setpixmap(filepath);
printf("-----%d------\n",position);
}
// enumerate all images in the opened image's folder; entryInfoList() sorted by QDir::Name keeps the order consistent on Linux
void csImageScan::Open_File()
{
QDir dir(folderpath);
QFileInfoList fileList;
QFileInfo curFile;
QStringList filters;
//filters << "*.jpg"<<"*.png";
filters << "*.jpg"<<"*.png"<<"*.bmp";
fileList=dir.entryInfoList(filters,QDir::Dirs|QDir::Files
|QDir::Readable|QDir::Writable
|QDir::Hidden|QDir::NoDotAndDotDot
,QDir::Name);
int i=0;
if(fileList.size()>0)
{
//curFile=fileList[position];
while(i<fileList.size())
{
curFile=fileList[i];
if(filepath == curFile.filePath())
position = i;
stringlist<<curFile.filePath();
printf("----delete----%d----------\n",i);
printf("%s----\n",curFile.filePath().toAscii().constData());
//printf("%s----\n",filepath.toAscii().constData());
i++;
}
}
}
// show the previous image
void csImageScan::on_pre_clicked()
{
left=0;
right=0;
num_Roll = 0;
if(0 == stringlist.size())
{
printf("get pre image erron \n");
return ;
}
position--;
if(position<0)
{
position=stringlist.size()-1;
}
filepath = stringlist.at(position);
setpixmap(filepath);
printf("---pre----%d-------------------\n",position);
printf("%s----\n",filepath.toAscii().constData());
}
// show the next image
void csImageScan::on_next_clicked()
{
left=0;
right=0;
num_Roll = 0;
if(0 == stringlist.size())
{
printf("get next image erron \n");
return ;
}
position++;
if(position>stringlist.size()-1)
{
position=0;
}
filepath = stringlist.at(position);
setpixmap(filepath);
printf("---next----%d-------------------\n",position);
printf("%s----\n",filepath.toAscii().constData());
}
// rotate the image counter-clockwise
void csImageScan::on_rotuteleft_clicked()
{
if(0 == stringlist.size())
{
printf("get next image erron \n");
return ;
}
left=1;
if(left==1&&right==1)
{
num_Roll--;
//QImage image(stringlist.at(position));
QMatrix matrix;
//matrix.rotate(-90.0*num_Roll);
matrix.rotate(-90.0);
image = image.transformed(matrix,Qt::FastTransformation);
pix = pix.fromImage(image);
update();
}
else
{
num_Roll++;
//QImage image(stringlist.at(position));
image.load(stringlist.at(position));
QMatrix matrix;
matrix.rotate(-90.0*num_Roll);
image = image.transformed(matrix,Qt::FastTransformation);
pix = pix.fromImage(image);
update();
}
}
// rotate the image clockwise
void csImageScan::on_rotuteright_clicked()
{
if(0 == stringlist.size())
{
printf("get next image erron \n");
return ;
}
right=1;
if(left==1&&right==1)
{
//num_Roll--;
//QImage image(stringlist.at(position));
QMatrix matrix;
//matrix.rotate(90.0*num_Roll);
matrix.rotate(90.0);
image = image.transformed(matrix,Qt::FastTransformation);
pix = pix.fromImage(image);
update();
}
else
{
num_Roll++;
//QImage image(stringlist.at(position));
image.load(stringlist.at(position));
QMatrix matrix;
matrix.rotate(90.0*num_Roll);
image = image.transformed(matrix,Qt::FastTransformation);
pix = pix.fromImage(image);
update();
}
}
// delete the current image
void csImageScan::on_pbdelete_clicked()
{
if(stringlist.size() == 0)
return;
QMessageBox::StandardButton rb = QMessageBox::question(NULL,"Warning",
"Do you want to delete the picture?",QMessageBox::Yes | QMessageBox::No, QMessageBox::Yes);
if(rb == QMessageBox::Yes)
{
stringlist.takeAt(position);
QDir dir(folderpath);
QFileInfoList fileList;
QFileInfo curFile;
QStringList filters;
filters << "*.jpg"<<"*.png"<<"*.bmp";
fileList=dir.entryInfoList(filters,QDir::Dirs|QDir::Files
|QDir::Readable|QDir::Writable
|QDir::Hidden|QDir::NoDotAndDotDot
,QDir::Name);
if(fileList.size()>0)
{
curFile=fileList[position];
QFile fileTemp(curFile.filePath());
fileTemp.remove();
fileList.removeAt(position);
//QFile filetemp(curFile.filePath().remove(QString("_screen"), Qt::CaseSensitive));
//filetemp.remove();
printf("----delete----%d----------\n",position);
printf("%s----\n",curFile.filePath().toAscii().constData());
}
if(0 == stringlist.size())
{
printf("get next image erron \n");
return ;
}
else
{
filepath = stringlist.at(position);
setpixmap(filepath);
}
}
}
// repaint event: draw the current pixmap
void csImageScan::paintEvent ( QPaintEvent * )
{
QPainter painter(this);
painter.drawPixmap(43,0,pix);
}
void csImageScan::setpixmap(QString imageName)
{
pix.load(imageName);
update();
}
The rest of the code is exactly the same as before.
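The post shows only csimagescan.cpp; the matching csimagescan.h is not included. As a rough reconstruction from the members and slots that the .cpp file uses (member types and access specifiers here are assumptions, not the original file), the header could look roughly like this:
#ifndef CSIMAGESCAN_H
#define CSIMAGESCAN_H

#include <QDialog>
#include <QPixmap>
#include <QImage>
#include <QStringList>

namespace Ui {
class csImageScan;
}

class csImageScan : public QDialog
{
    Q_OBJECT
public:
    explicit csImageScan(QWidget *parent = 0);
    ~csImageScan();
    void Open_File();                    // enumerate the images in the current folder
    void setpixmap(QString imageName);   // load an image and trigger a repaint
protected:
    void paintEvent(QPaintEvent *);
private slots:
    void on_pbexit_clicked();
    void on_open_clicked();
    void on_pre_clicked();
    void on_next_clicked();
    void on_rotuteleft_clicked();
    void on_rotuteright_clicked();
    void on_pbdelete_clicked();
private:
    Ui::csImageScan *ui;
    int position;           // index of the currently displayed image
    int left, right;        // rotation state flags
    int num_Roll;           // accumulated rotation count
    QString filepath;       // full path of the current image
    QString folderpath;     // folder containing the current image
    QStringList filelist;   // path split into components
    QStringList stringlist; // all image paths in the folder, sorted by name
    QPixmap pix;            // pixmap drawn in paintEvent
    QImage image;           // working image used for rotation
};

#endif // CSIMAGESCAN_H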
An earlier post covered live video streaming over RTSP (Real Time Streaming Protocol). One drawback of RTSP is that supporting playback from a web page requires embedding an ActiveX control in the page; ActiveX controls generally need to be signed to work properly, otherwise users have to change their browser settings, and they only run in IE-based browsers, with Chrome and Firefox needing an IE plug-in, so the user experience suffers badly. RTMP (Real Time Messaging Protocol) solves this problem nicely. Because RTMP is the streaming protocol used by Flash, once video is published over RTMP it can be watched simply by embedding a web player (such as JW Player) in the page, with essentially no platform restrictions, and it can also be watched conveniently on a phone.
Publishing video over RTMP requires an RTMP server (common commercial ones are FMS and Wowza Media Server; open-source options include crtmpserver and Red5). The source video only has to be sent to the RTMP server according to the RTMP protocol to be published as an RTMP stream. To make packaging and publishing easier, I wrapped this into an RTMPStream class; it currently only supports sending H.264 video, either as individual H.264 frames or as an H.264 file. The interface provided by RTMPStream is shown below.
class CRTMPStream
{
public:
CRTMPStream(void);
~CRTMPStream(void);
public:
// connect to the RTMP server
bool Connect(const char* url);
// disconnect
void Close();
// send the metadata
bool SendMetadata(LPRTMPMetadata lpMetaData);
// send one H.264 frame
bool SendH264Packet(unsigned char *data,unsigned int size,bool bIsKeyFrame,unsigned int nTimeStamp);
// send an H.264 file
bool SendH264File(const char *pFileName);
//...
};
Usage example:
#include <stdio.h>
#include "RTMPStream\RTMPStream.h"
int main(int argc,char* argv[])
{
CRTMPStream rtmpSender;
bool bRet = rtmpSender.Connect("rtmp://192.168.1.104/live/test");
rtmpSender.SendH264File("E:\\video\\test.264");
rtmpSender.Close();
}
The published stream can then be played back in a browser through JW Player.
Finally, the complete RTMPStream code:
/********************************************************************
filename: RTMPStream.h
created: 2013-04-3
author: firehood
purpose: send H.264 video to an RTMP server, using the librtmp library
*********************************************************************/
#pragma once
#include "rtmp.h"
#include "rtmp_sys.h"
#include "amf.h"
#include <stdio.h>
#define FILEBUFSIZE (1024 * 1024 * 10) // 10M
// one NALU (Network Abstraction Layer unit)
typedef struct _NaluUnit
{
int type;
int size;
unsigned char *data;
}NaluUnit;
typedef struct _RTMPMetadata
{
// video, must be h264 type
unsigned int nWidth;
unsigned int nHeight;
unsigned int nFrameRate; // fps
unsigned int nVideoDataRate; // bps
unsigned int nSpsLen;
unsigned char Sps[1024];
unsigned int nPpsLen;
unsigned char Pps[1024];
// audio, must be aac type
bool bHasAudio;
unsigned int nAudioSampleRate;
unsigned int nAudioSampleSize;
unsigned int nAudioChannels;
char pAudioSpecCfg;
unsigned int nAudioSpecCfgLen;
} RTMPMetadata,*LPRTMPMetadata;
class CRTMPStream
{
public:
CRTMPStream(void);
~CRTMPStream(void);
public:
// connect to the RTMP server
bool Connect(const char* url);
// disconnect
void Close();
// send the metadata
bool SendMetadata(LPRTMPMetadata lpMetaData);
// send one H.264 frame
bool SendH264Packet(unsigned char *data,unsigned int size,bool bIsKeyFrame,unsigned int nTimeStamp);
// send an H.264 file
bool SendH264File(const char *pFileName);
private:
// read one NALU from the file buffer
bool ReadOneNaluFromBuf(NaluUnit &nalu);
// send a packet
int SendPacket(unsigned int nPacketType,unsigned char *data,unsigned int size,unsigned int nTimestamp);
private:
RTMP* m_pRtmp;
unsigned char* m_pFileBuf;
unsigned int m_nFileBufSize;
unsigned int m_nCurPos;
};
/********************************************************************
filename: RTMPStream.cpp
created: 2013-04-3
author: firehood
purpose: send H.264 video to an RTMP server, using the librtmp library
*********************************************************************/
#include "RTMPStream.h"
#include "SpsDecode.h"
#ifdef WIN32
#include <windows.h>
#endif
enum
{
FLV_CODECID_H264 = 7,
};
int InitSockets()
{
#ifdef WIN32
WORD version;
WSADATA wsaData;
version = MAKEWORD(1, 1);
return (WSAStartup(version, &wsaData) == 0);
#else
return TRUE;
#endif
}
inline void CleanupSockets()
{
#ifdef WIN32
WSACleanup();
#endif
}
char * put_byte( char *output, uint8_t nVal )
{
output[0] = nVal;
return output+1;
}
char * put_be16(char *output, uint16_t nVal )
{
output[1] = nVal & 0xff;
output[0] = nVal >> 8;
return output+2;
}
char * put_be24(char *output,uint32_t nVal )
{
output[2] = nVal & 0xff;
output[1] = nVal >> 8;
output[0] = nVal >> 16;
return output+3;
}
char * put_be32(char *output, uint32_t nVal )
{
output[3] = nVal & 0xff;
output[2] = nVal >> 8;
output[1] = nVal >> 16;
output[0] = nVal >> 24;
return output+4;
}
char * put_be64( char *output, uint64_t nVal )
{
output=put_be32( output, nVal >> 32 );
output=put_be32( output, nVal );
return output;
}
char * put_amf_string( char *c, const char *str )
{
uint16_t len = strlen( str );
c=put_be16( c, len );
memcpy(c,str,len);
return c+len;
}
char * put_amf_double( char *c, double d )
{
*c++ = AMF_NUMBER; /* type: Number */
{
unsigned char *ci, *co;
ci = (unsigned char *)&d;
co = (unsigned char *)c;
co[0] = ci[7];
co[1] = ci[6];
co[2] = ci[5];
co[3] = ci[4];
co[4] = ci[3];
co[5] = ci[2];
co[6] = ci[1];
co[7] = ci[0];
}
return c+8;
}
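// Note: AMF0 is big-endian throughout. The put_be16/put_be24/put_be32/put_be64 helpers
// above write integers most-significant byte first, and put_amf_double emits the
// AMF_NUMBER marker followed by the 8 bytes of the IEEE-754 double in reversed order,
// which turns the little-endian in-memory layout of an x86 host into network byte order.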
CRTMPStream::CRTMPStream(void):
m_pRtmp(NULL),
m_nFileBufSize(0),
m_nCurPos(0)
{
m_pFileBuf = new unsigned char[FILEBUFSIZE];
memset(m_pFileBuf,0,FILEBUFSIZE);
InitSockets();
m_pRtmp = RTMP_Alloc();
RTMP_Init(m_pRtmp);
}
CRTMPStream::~CRTMPStream(void)
{
Close();
CleanupSockets();
delete[] m_pFileBuf;
}
bool CRTMPStream::Connect(const char* url)
{
if(RTMP_SetupURL(m_pRtmp, (char*)url) == FALSE)
{
return FALSE;
}
RTMP_EnableWrite(m_pRtmp);
if(RTMP_Connect(m_pRtmp, NULL) == FALSE)
{
return FALSE;
}
if(RTMP_ConnectStream(m_pRtmp,0) == FALSE)
{
return FALSE;
}
return TRUE;
}
void CRTMPStream::Close()
{
if(m_pRtmp)
{
RTMP_Close(m_pRtmp);
RTMP_Free(m_pRtmp);
m_pRtmp = NULL;
}
}
int CRTMPStream::SendPacket(unsigned int nPacketType,unsigned char *data,unsigned int size,unsigned int nTimestamp)
{
if(m_pRtmp == NULL)
{
return FALSE;
}
RTMPPacket packet;
RTMPPacket_Reset(&packet);
RTMPPacket_Alloc(&packet,size);
packet.m_packetType = nPacketType;
packet.m_nChannel = 0x04;
packet.m_headerType = RTMP_PACKET_SIZE_LARGE;
packet.m_nTimeStamp = nTimestamp;
packet.m_nInfoField2 = m_pRtmp->m_stream_id;
packet.m_nBodySize = size;
memcpy(packet.m_body,data,size);
int nRet = RTMP_SendPacket(m_pRtmp,&packet,0);
RTMPPacket_Free(&packet);
return nRet;
}
bool CRTMPStream::SendMetadata(LPRTMPMetadata lpMetaData)
{
if(lpMetaData == NULL)
{
return false;
}
char body[1024] = {0};
char * p = (char *)body;
p = put_byte(p, AMF_STRING );
p = put_amf_string(p , "@setDataFrame" );
p = put_byte( p, AMF_STRING );
p = put_amf_string( p, "onMetaData" );
p = put_byte(p, AMF_OBJECT );
p = put_amf_string( p, "copyright" );
p = put_byte(p, AMF_STRING );
p = put_amf_string( p, "firehood" );
p =put_amf_string( p, "width");
p =put_amf_double( p, lpMetaData->nWidth);
p =put_amf_string( p, "height");
p =put_amf_double( p, lpMetaData->nHeight);
p =put_amf_string( p, "framerate" );
p =put_amf_double( p, lpMetaData->nFrameRate);
p =put_amf_string( p, "videocodecid" );
p =put_amf_double( p, FLV_CODECID_H264 );
p =put_amf_string( p, "" );
p =put_byte( p, AMF_OBJECT_END );
int index = p-body;
SendPacket(RTMP_PACKET_TYPE_INFO,(unsigned char*)body,p-body,0);
int i = 0;
body[i++] = 0x17; // 1:keyframe 7:AVC
body[i++] = 0x00; // AVC sequence header
body[i++] = 0x00;
body[i++] = 0x00;
body[i++] = 0x00; // fill in 0;
// AVCDecoderConfigurationRecord.
body[i++] = 0x01; // configurationVersion
body[i++] = lpMetaData->Sps[1]; // AVCProfileIndication
body[i++] = lpMetaData->Sps[2]; // profile_compatibility
body[i++] = lpMetaData->Sps[3]; // AVCLevelIndication
body[i++] = 0xff; // lengthSizeMinusOne
// sps nums
body[i++] = 0xE1; //&0x1f
// sps data length
body[i++] = lpMetaData->nSpsLen>>8;
body[i++] = lpMetaData->nSpsLen&0xff;
// sps data
memcpy(&body[i],lpMetaData->Sps,lpMetaData->nSpsLen);
i= i+lpMetaData->nSpsLen;
// pps nums
body[i++] = 0x01; //&0x1f
// pps data length
body[i++] = lpMetaData->nPpsLen>>8;
body[i++] = lpMetaData->nPpsLen&0xff;
// pps data
memcpy(&body[i],lpMetaData->Pps,lpMetaData->nPpsLen);
i= i+lpMetaData->nPpsLen;
return SendPacket(RTMP_PACKET_TYPE_VIDEO,(unsigned char*)body,i,0);
}
bool CRTMPStream::SendH264Packet(unsigned char *data,unsigned int size,bool bIsKeyFrame,unsigned int nTimeStamp)
{
if(data == NULL && size<11)
{
return false;
}
unsigned char *body = new unsigned char[size+9];
int i = 0;
if(bIsKeyFrame)
{
body[i++] = 0x17;// 1:Iframe 7:AVC
}
else
{
body[i++] = 0x27;// 2:Pframe 7:AVC
}
body[i++] = 0x01;// AVC NALU
body[i++] = 0x00;
body[i++] = 0x00;
body[i++] = 0x00;
// NALU size
body[i++] = size>>24;
body[i++] = size>>16;
body[i++] = size>>8;
body[i++] = size&0xff;;
// NALU data
memcpy(&body[i],data,size);
bool bRet = SendPacket(RTMP_PACKET_TYPE_VIDEO,body,i+size,nTimeStamp);
delete[] body;
return bRet;
}
bool CRTMPStream::SendH264File(const char *pFileName)
{
if(pFileName == NULL)
{
return FALSE;
}
FILE *fp = fopen(pFileName, "rb");
if(!fp)
{
printf("ERROR:open file %s failed!",pFileName);
return FALSE;
}
fseek(fp, 0, SEEK_SET);
m_nFileBufSize = fread(m_pFileBuf, sizeof(unsigned char), FILEBUFSIZE, fp);
if(m_nFileBufSize >= FILEBUFSIZE)
{
printf("warning : File size is larger than BUFSIZE\n");
}
fclose(fp);
RTMPMetadata metaData;
memset(&metaData,0,sizeof(RTMPMetadata));
NaluUnit naluUnit;
// read the SPS NALU
ReadOneNaluFromBuf(naluUnit);
metaData.nSpsLen = naluUnit.size;
memcpy(metaData.Sps,naluUnit.data,naluUnit.size);
// read the PPS NALU
ReadOneNaluFromBuf(naluUnit);
metaData.nPpsLen = naluUnit.size;
memcpy(metaData.Pps,naluUnit.data,naluUnit.size);
// decode the SPS to get the video width and height
int width = 0,height = 0;
h264_decode_sps(metaData.Sps,metaData.nSpsLen,width,height);
metaData.nWidth = width;
metaData.nHeight = height;
metaData.nFrameRate = 25;
// send the metadata
SendMetadata(&metaData);
unsigned int tick = 0;
while(ReadOneNaluFromBuf(naluUnit))
{
bool bKeyframe = (naluUnit.type == 0x05) ? TRUE : FALSE;
// send one H.264 frame
SendH264Packet(naluUnit.data,naluUnit.size,bKeyframe,tick);
msleep(40);
tick +=40;
}
return TRUE;
}
bool CRTMPStream::ReadOneNaluFromBuf(NaluUnit &nalu)
{
int i = m_nCurPos;
while(i<m_nFileBufSize-4)
{
if(m_pFileBuf[i++] == 0x00 &&
m_pFileBuf[i++] == 0x00 &&
m_pFileBuf[i++] == 0x00 &&
m_pFileBuf[i++] == 0x01
)
{
int pos = i;
while (pos<m_nFileBufSize-4)
{
if(m_pFileBuf[pos++] == 0x00 &&
m_pFileBuf[pos++] == 0x00 &&
m_pFileBuf[pos++] == 0x00 &&
m_pFileBuf[pos++] == 0x01
)
{
break;
}
}
nalu.type = m_pFileBuf[i]&0x1f;
nalu.size = (pos-4)-i;
nalu.data = &m_pFileBuf[i];
m_nCurPos = pos-4;
return TRUE;
}
}
return FALSE;
}