Preface
This is my first time digging into instant messaging. Based on material gathered online, I put together a small WeChat-style demo; suggestions and corrections are very welcome.
The full project can be downloaded from GitHub (WeiChat); if you find it worthwhile, please give it a star.
Continued from Part 10: Instant Messaging with XMPPFramework — Login and Registration
Continued from Part 11: Instant Messaging with XMPPFramework — vCard
Continued from Part 12: Instant Messaging with XMPPFramework — Roster
**Note: change the host address and port below to your own server's,
and make sure both the database and the server are running, otherwise login will fail.**
#ifdef DEBUG
#define kHostName @"192.168.199.111"
//#define kHostName @"127.0.0.1"
#define kHostPort 5222
#else
#define kHostName @""
#define kHostPort 5222
#endif
先來談?wù)勛隽奶旃δ艿母邢?其實吧,這個也不復(fù)雜,主要是界面搭建過于麻煩,先來說功能實現(xiàn):
1.關(guān)于聊天功能,當(dāng)日這個包括有文字,語音,圖片,還有文件傳輸,視頻音頻聊天,這些我就沒寫了,就實現(xiàn)了文字,語音,圖片這三個部分,后面的再寫也沒多大意思了.
2.其實關(guān)于聊天說到底就一個方法sendElement:(NSXMLElement *)element,后面跟上你要發(fā)送的內(nèi)容就行了.
3.既然有各種不同的聊天內(nèi)容,那就要加上關(guān)鍵字進(jìn)行判斷來區(qū)分是文字是語音還是圖片.
4.這區(qū)分不同內(nèi)容也有好幾種方式
5.對于發(fā)送內(nèi)容有兩種方式,當(dāng)日是對于語音圖片,這種大容量的內(nèi)容來說的,第一種是使用XML來攜帶內(nèi)容,這種方式對服務(wù)器壓力很大,因為一旦攜帶的內(nèi)容過大,那么就會使傳輸速度變得很慢.另一種是通過文件服務(wù)器的方式,通過put將要傳輸方式發(fā)送到文件服務(wù)器,然后發(fā)送URL就可以了,對于文件的傳輸與接收都是通過上傳下載來完成.這兩種方式都有實現(xiàn),可以看demo.
6.聊天功能最重要的就是你要知道你傳輸?shù)膍essage的結(jié)構(gòu)是什么樣的,由于是用XML組織的,所以很容易就能知道你傳輸?shù)膬?nèi)容組織,這樣能方便你進(jìn)行區(qū)分傳輸內(nèi)容以及擴(kuò)展你想要的一些功能,還要做一些回執(zhí)操作,以及類似心跳包之類的操作.
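To make point 6 concrete, a text message built the way this demo does it (content type in the `<body>`, payload in an `<attachment>` child) ends up on the wire roughly like this (the JID is a placeholder):

```xml
<message type="chat" to="friend@example.com">
  <body>text</body>
  <attachment>Hello there!</attachment>
</message>
```

Voice and image messages use the same shape, with `voice` or `image` in the body and a base64 string in the attachment.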
Now for the UI:
1. The first priority is choosing a good design pattern. Here I use MVVM: the original data model is split into a data model plus a frame model, separating out the layout logic. As soon as the data arrives, both the data and the frames are computed, and they are simply assigned when the cell is configured.
2. Next comes the frame calculation: with text, images, and voice all present, the self-sizing cell heights take some careful work.
3. Keyboard handling is genuinely fiddly, especially the custom panel that pops up after tapping the attachment button. See the demo for that; this post focuses on functionality, so use your imagination for the UI.
4. There are plenty of other UI details, some of which the demo does not handle perfectly; please bear with me.
Screenshots:
The height is slightly off here, and I have not found the cause yet.
Note that you should test on both the simulator and a real device; the screenshots here show only the simulator.
Fair warning: the chat screen still has one bug I have not tracked down, but it does not affect normal use.
This post covers the implementation of the messaging itself. It is a bit long, so please be patient.
1. Activating the chat module
Nothing special here; activate it the same way as the earlier modules:
/**
* Activate the XMPPMessageArchiving chat module
*/
self.archivingStorage = [XMPPMessageArchivingCoreDataStorage sharedInstance];
self.messageArchiving = [[XMPPMessageArchiving alloc] initWithMessageArchivingStorage:self.archivingStorage dispatchQueue:dispatch_get_main_queue()];
[self.messageArchiving activate:self.stream];
2. Sending a text message
#pragma mark - UITextViewDelegate: send the message when return is tapped
#pragma mark Send a text chat message
- (void)textViewDidChange:(UITextView *)textView {
if ([textView.text hasSuffix:@"\n"]) {
NSLog(@"message sent");
[self sendMessageWithText:textView.text bodyType:@"text"];
textView.text = nil;
}
}
- (void)sendMessageWithText:(NSString *)text bodyType:(NSString *)type {
// XMPPMessage *message = [XMPPMessage messageWithType:@"chat" to:self.jidChatTo.jid];
// // Set the bodyType attribute to text
// [message addAttributeWithName:@"bodyType" stringValue:type];
// [message addBody:text];
// [[XMPPManager sharedmanager].stream sendElement:message];
XMPPMessage* message = [[XMPPMessage alloc] initWithType:@"chat" to:self.jidChatTo.jid];
[message addBody:type];
// Set the node content
XMPPElement *attachment = [XMPPElement elementWithName:@"attachment" stringValue:text];
// Add the child node
[message addChild:attachment];
[[XMPPManager sharedmanager].stream sendElement:message];
}
3. Fetching stored chat messages
- (void)relodChatMessage {
XMPPManager *manager = [XMPPManager sharedmanager];
NSManagedObjectContext *context = manager.archivingStorage.mainThreadManagedObjectContext;
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"XMPPMessageArchiving_Message_CoreDataObject"];
NSPredicate *predicate = [NSPredicate predicateWithFormat:@"streamBareJidStr=%@ AND bareJidStr=%@",[UserManager sharedmanager].jid,self.jidChatTo.jid.bare];
NSSortDescriptor *timeSort = [NSSortDescriptor sortDescriptorWithKey:@"timestamp" ascending:YES];
request.sortDescriptors = @[timeSort];
request.predicate = predicate;
self.resultController = [[NSFetchedResultsController alloc] initWithFetchRequest:request managedObjectContext:context sectionNameKeyPath:nil cacheName:nil];
self.resultController.delegate = self;
NSError *error = nil;
if (![self.resultController performFetch:&error]) {
NSLog(@"fetch failed: %@", error);
}
//NSLog(@"-----%@",self.resultController.fetchedObjects);
[self getChatMsgArray];
}
#pragma mark - NSFetchedResultsControllerDelegate
// Delegate method, called whenever the fetched data changes
- (void)controller:(NSFetchedResultsController *)controller didChangeObject:(id)anObject atIndexPath:(nullable NSIndexPath *)indexPath forChangeType:(NSFetchedResultsChangeType)type newIndexPath:(nullable NSIndexPath *)newIndexPath {
[self relodChatMessage];
}
// Fetch the conversation array from the database, populate the model objects, and compute each message's frame
/**
* The MVVM design pattern
Two models are provided:
>data model: holds the text/image data
>frame model: holds the data model, the frames of all subviews, and the cell height
The cell directly owns a frame model (not the data model); that is, the frame model owns the data model,
and the cell is configured with the frame model alone
*/
- (void)getChatMsgArray {
[self.chatMsgArray removeAllObjects];
for (XMPPMessageArchiving_Message_CoreDataObject *msg in self.resultController.fetchedObjects) {
ChatMessageModel *messageModel = [[ChatMessageModel alloc] init];
// Pull the previous message's timestamp out of the last model so the data model can use it
if (self.chatMsgArray.count) {
ChatFrameModel *preChatFrameModel = self.chatMsgArray.lastObject;
messageModel.preMsgDate = preChatFrameModel.msg.msg.timestamp;
}
// data model setter
messageModel.msg = msg;
ChatFrameModel *frameModel = [[ChatFrameModel alloc] init];
// frame model setter
frameModel.msg = messageModel;
[self.chatMsgArray addObject:frameModel];
// image browser
if ([msg.message.body isEqualToString:@"image"]) {
XMPPElement *node = msg.message.children.lastObject;
// extract the base64 payload and decode it
NSString *base64str = node.stringValue;
NSData *data = [[NSData alloc]initWithBase64EncodedString:base64str options:0];
UIImage *image = [[UIImage alloc]initWithData:data];
[self.chatImageArray addObject:image];
}
}
[self.myTab reloadData];
[self scrollToBottom];
}
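For reference, the two models described in the comment above might be declared roughly like this (a sketch; only `msg` and `preMsgDate` appear in the code above, the remaining layout properties are assumptions):

```objc
// Sketch only; the demo's real classes carry more properties.
@interface ChatMessageModel : NSObject
// Raw archived message from Core Data
@property (nonatomic, strong) XMPPMessageArchiving_Message_CoreDataObject *msg;
// Timestamp of the previous message, used to decide whether to show a time label
@property (nonatomic, strong) NSDate *preMsgDate;
@end

@interface ChatFrameModel : NSObject
// The frame model owns the data model; its setter computes all frames
@property (nonatomic, strong) ChatMessageModel *msg;
// Assumed layout properties: bubble frame and total row height
@property (nonatomic, assign) CGRect bubbleFrame;
@property (nonatomic, assign) CGFloat cellHeight;
@end
```

Because the cell only ever sees a ChatFrameModel, all measuring happens once when the data arrives rather than on every scroll.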
I won't walk through the business logic here; see the demo if you are interested.
4. Sending a voice message
#pragma mark ******************************
#pragma mark -- Send a voice chat message
- (IBAction)sendVoiceBtn:(UIButton *)sender {
if (CGRectGetMaxY(self.moreView.frame) == [UIScreen mainScreen].bounds.size.height) {
self.moreView.frame = kMoreInputViewOriFrame;
[self.chatTextView resignFirstResponder];
}
if (self.inputViewBottonConstraint.constant == 200) {
[self.chatTextView becomeFirstResponder];
[self dissmissMoreInputViewWithAniation:YES];
}
if (!self.sendVoiceBtn.hidden) {
[self.chatTextView becomeFirstResponder];
//self.inputViewBottonConstraint.constant = 0;
} else {
if ([self.chatTextView isFirstResponder]) {
[self.chatTextView resignFirstResponder];
}
}
self.sendVoiceBtn.hidden = sender.selected;
sender.selected = !sender.selected;
UIImage *normalImage = sender.selected ? [UIImage imageNamed:@"ToolViewKeyboard"] : [UIImage imageNamed:@"ToolViewInputVoice"];
UIImage *highlightImage = sender.selected ? [UIImage imageNamed:@"ToolViewKeyboardHL"] : [UIImage imageNamed:@"ToolViewInputVoiceHL"];
[sender setImage:normalImage forState:UIControlStateNormal];
[sender setImage:highlightImage forState:UIControlStateHighlighted];
}
// Touch down on the button: start recording
- (IBAction)sendTouchDown:(UIButton *)sender {
NSLog(@"%s, line = %d", __FUNCTION__, __LINE__);
self.sendVoiceBtn.backgroundColor = [UIColor lightGrayColor];
// Show the recording HUD
popVoiceView *voiceV = [popVoiceView voiceAlertPopView];
voiceV.bounds = CGRectMake(0, 0, 150, 150);
CGFloat centerX = [UIScreen mainScreen].bounds.size.width / 2.0;
CGFloat centerY = [UIScreen mainScreen].bounds.size.height / 2.0;
voiceV.center = CGPointMake(centerX, centerY);
self.voiceView = voiceV;
[self.view addSubview:self.voiceView];
// start recording
[self.audioRecorder record];
}
// Lift the finger inside the button: send the voice message
- (IBAction)sendTouchUpInside:(UIButton *)sender {
NSLog(@"%s, line = %d", __FUNCTION__, __LINE__);
self.sendVoiceBtn.backgroundColor = BackGround243Color;
NSTimeInterval time = self.audioRecorder.currentTime;
if (time < 1.5) {
// Under 1.5 seconds: discard; only longer recordings are sent
// stop recording
[self.audioRecorder stop];
// delete the recording file
[self.audioRecorder deleteRecording];
self.voiceView.voiceImageV.image = [UIImage imageNamed:@"QQ20160818-3"];
self.voiceView.voiceTitleLab.text = @"Message too short";
} else {
// stop recording
[self.audioRecorder stop];
// send the voice message
NSString *urlStr = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
urlStr = [urlStr stringByAppendingPathComponent:kRecordAudioFile];
NSData *voiceData = [NSData dataWithContentsOfFile:urlStr];
[self sendVoiceMessageWithData:voiceData bodyType:@"voice" withDuringTime:time];
}
[self.voiceView removeFromSuperview];
}
- (void)sendVoiceMessageWithData:(NSData *)data bodyType:(NSString *)type withDuringTime:(NSTimeInterval)time{
XMPPMessage* message = [[XMPPMessage alloc] initWithType:@"chat" to:self.jidChatTo.jid];
// carry the duration along in an attribute
NSString *timeStr = [NSString stringWithFormat:@"%f",time];
[message addAttributeWithName:@"duringTime" stringValue:timeStr];
[message addBody:type];
NSString *base64str = [data base64EncodedStringWithOptions:0];
XMPPElement *attachment = [XMPPElement elementWithName:@"attachment" stringValue:base64str];
[message addChild:attachment];
[[XMPPManager sharedmanager].stream sendElement:message];
}
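On the receiving side the duration can be read back from that attribute when configuring the cell, e.g. (a sketch; `message` stands for the archived XMPPMessage, and the label format is an assumption):

```objc
// Read the duration the sender stored in the duringTime attribute
NSString *timeStr = [message attributeStringValueForName:@"duringTime"];
NSTimeInterval duration = timeStr.doubleValue;
// Round up for a WeChat-style label such as 3''
NSString *durationLabel = [NSString stringWithFormat:@"%.0f''", ceil(duration)];
```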
// Finger dragged outside the button: about to cancel
- (IBAction)sendDragOutside:(UIButton *)sender {
NSLog(@"%s, line = %d", __FUNCTION__, __LINE__);
self.voiceView.voiceImageV.image = [UIImage imageNamed:@"QQ20160818-2"];
self.voiceView.voiceTitleLab.text = @"Release to cancel";
self.voiceView.voiceTitleLab.backgroundColor = [UIColor colorWithRed:0.826 green:0.0 blue:0.0 alpha:1.0];
}
// Finger lifted outside the button: cancel the recording
- (IBAction)sendTouchUpOutside:(UIButton *)sender {
NSLog(@"%s, line = %d", __FUNCTION__, __LINE__);
[self.voiceView removeFromSuperview];
// stop recording
[self.audioRecorder stop];
// delete the recording file
[self.audioRecorder deleteRecording];
}
// Build the save URL for the audio file
- (NSURL *)getSavePath {
NSString *urlStr = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
urlStr = [urlStr stringByAppendingPathComponent:kRecordAudioFile];
// Use fileURLWithPath: for a local path; URLWithString: yields a schemeless URL that AVAudioRecorder cannot write to
NSURL *url = [NSURL fileURLWithPath:urlStr];
return url;
}
/**
* Configure the audio session
Note: you must set up the audio session, otherwise recording durations are wrong on a real device and playback will not work
*/
-(void)setAudioSession{
AVAudioSession *audioSession=[AVAudioSession sharedInstance];
// Play-and-record category, so the recording can be played back afterwards
[audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
[audioSession setActive:YES error:nil];
}
// Recording settings
- (NSDictionary *)getAudioSetting {
NSMutableDictionary *dicM = [NSMutableDictionary dictionary];
// recording format
[dicM setObject:@(kAudioFormatLinearPCM) forKey:AVFormatIDKey];
// sample rate; 8000 Hz is telephone quality, enough for ordinary voice
[dicM setObject:@(8000) forKey:AVSampleRateKey];
// number of channels; mono here
[dicM setObject:@(1) forKey:AVNumberOfChannelsKey];
// bits per sample: 8, 16, 24, or 32
[dicM setObject:@(8) forKey:AVLinearPCMBitDepthKey];
// whether to use floating-point samples
[dicM setObject:@(YES) forKey:AVLinearPCMIsFloatKey];
// ...other settings
return dicM;
}
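Putting the helpers together, the recorder itself might be created like this (a sketch with minimal error handling; the property name `audioRecorder` matches the code above):

```objc
NSError *error = nil;
// Save URL and settings come from the two helpers above
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:[self getSavePath]
                                                 settings:[self getAudioSetting]
                                                    error:&error];
self.audioRecorder.delegate = self;
// Create the audio file ahead of the first -record call
[self.audioRecorder prepareToRecord];
```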
#pragma mark -- AVAudioRecorderDelegate
// recording finished
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag {
//NSLog(@"recording finished");
}
#pragma mark -- AVAudioPlayerDelegate
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
NSLog(@"playback finished");
}
5. Playing a voice message
if ([chatFrameModel.msg.msg.message.body isEqualToString:@"voice"]) {
if (self.audioPlayer.isPlaying) {
[self.audioPlayer stop];
}
XMPPElement *node = chatFrameModel.msg.msg.message.children.lastObject;
// extract the base64 payload and decode it
NSString *base64str = node.stringValue;
NSData *data = [[NSData alloc]initWithBase64EncodedString:base64str options:0];
self.audioPlayer = [[AVAudioPlayer alloc] initWithData:data error:NULL];
self.audioPlayer.delegate = self;
[self.audioPlayer play];
}
6. Sending an image message
XMPPMessage* message = [[XMPPMessage alloc] initWithType:@"chat" to:self.jidChatTo.jid];
// Set bodyType to image
//[message addAttributeWithName:@"bodyType" stringValue:type];
#warning Images and voice are carried as base64 inside the XML here, so transfers are slow. No file server was set up; a file server plus a URL in the message is the recommended approach.
// The body carries the content type, and type checks on receipt are done against the body. I tried [message addAttributeWithName:@"bodyType" stringValue:type]; to carry the type instead, but it did not work; the cause is unknown for now.
[message addBody:type];
// base64-encode the data
NSString *base64str = [data base64EncodedStringWithOptions:0];
// set the node content
XMPPElement *attachment = [XMPPElement elementWithName:@"attachment" stringValue:base64str];
// add the child node
[message addChild:attachment];
#warning This is the file-server variant; since no file server was set up, the body below is a hard-coded file URL, for testing only.
//[message addBody:@"http://img5.duitang.com/uploads/item/201407/24/20140724054410_5ctE2.jpeg"];
// send the image message
[[XMPPManager sharedmanager].stream sendElement:message];
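As a minimal sketch of the file-server approach mentioned in point 5: PUT the data to an upload endpoint, then send only the resulting URL. The endpoint and the `imageData` variable here are placeholders, not part of the demo:

```objc
// Placeholder endpoint: replace with your own file server's upload URL.
NSURL *uploadURL = [NSURL URLWithString:@"http://example.com/upload/photo.jpg"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:uploadURL];
request.HTTPMethod = @"PUT";
// imageData is the NSData to transfer (e.g. UIImageJPEGRepresentation output)
NSURLSessionUploadTask *task =
[[NSURLSession sharedSession] uploadTaskWithRequest:request
                                           fromData:imageData
                                  completionHandler:^(NSData *d, NSURLResponse *r, NSError *e) {
    if (e) { return; }
    // On success, send only the URL string; the receiver downloads the file itself.
    XMPPMessage *message = [[XMPPMessage alloc] initWithType:@"chat" to:self.jidChatTo.jid];
    [message addBody:uploadURL.absoluteString];
    [[XMPPManager sharedmanager].stream sendElement:message];
}];
[task resume];
```

This keeps the XMPP stream small regardless of file size, at the cost of running a separate HTTP service.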
7. Responding to taps on chat content
Audio playback and image browsing are triggered through a cell delegate callback:
#pragma mark ******************************
#pragma mark --ChatCellDelegate
- (void)getCurrentChatCell:(ChatTableViewCell *)cell withCurrentChatFrame:(ChatFrameModel *)chatFrameModel {
//NSString *chatType = [chatFrameModel.msg.msg.message attributeStringValueForName:@"bodyType"];
if ([chatFrameModel.msg.msg.message.body isEqualToString:@"image"]) {
MWPhotoBrowser *browser = [[MWPhotoBrowser alloc] initWithDelegate:self];
NSUInteger index = 0;
// If this cell's body exists, find the matching entry in the image array and take its index
if (chatFrameModel.msg.msg.body) {
index = [self.chatImageArray indexOfObject:chatFrameModel.msg.msg.body];
}
// set the photo browser's starting index
[browser setCurrentPhotoIndex:index];
// push the photo browser
[self.navigationController pushViewController:browser animated:YES];
} else if ([chatFrameModel.msg.msg.message.body isEqualToString:@"voice"]) {
if (self.audioPlayer.isPlaying) {
[self.audioPlayer stop];
}
XMPPElement *node = chatFrameModel.msg.msg.message.children.lastObject;
// extract the base64 payload and decode it
NSString *base64str = node.stringValue;
NSData *data = [[NSData alloc]initWithBase64EncodedString:base64str options:0];
self.audioPlayer = [[AVAudioPlayer alloc] initWithData:data error:NULL];
self.audioPlayer.delegate = self;
[self.audioPlayer play];
}
}
8. Keyboard height
#pragma mark -- Adjust for keyboard frame changes
- (void)keyboardFrameChange:(NSNotification *)sender {
// get the keyboard's end frame
NSValue *keyboardFrame = sender.userInfo[UIKeyboardFrameEndUserInfoKey];
CGRect rect = [keyboardFrame CGRectValue];
CGFloat height = CGRectGetHeight(rect);
// compute the bottom offset for the input bar
if (rect.origin.y == [UIScreen mainScreen].bounds.size.height) {
self.inputViewBottonConstraint.constant = 0;
} else {
self.inputViewBottonConstraint.constant = height;
}
[UIView animateWithDuration:0.25 animations:^{
[self.view layoutIfNeeded];
}];
[self scrollToBottom];
}
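For completeness, this handler only fires if the controller registered for the frame-change notification, typically in viewDidLoad (a sketch; remember to remove the observer in dealloc):

```objc
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(keyboardFrameChange:)
                                             name:UIKeyboardWillChangeFrameNotification
                                           object:nil];
```

Using the will-change-frame notification (rather than separate show/hide ones) covers both appearance and dismissal with the single handler above.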